Compared to last year, are organisations becoming more aware of where their data resides?
Organisations are clearly lacking in terms of data identification. Because of the dynamic environment in organisations, people need to be very aware of the changes that keep happening there. With every change, there is significant scope for a potential breach. They may attain a particular stature and identify particular data elements, but because change is driven either by business or by IT, organisations need to be clear about the impact of that change: what data elements it will bring in, which data elements it will change, and which new threat avenues it will open. For example, when an internal portal becomes an external portal, does it open up any privacy-related issues? Has user consent been taken? Maintaining compliance is really difficult in IT and BPO. Organisations have to align their internal change with the ever-changing compliance landscape, as illustrated in the sketch below.
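As a rough illustration only, the following Python sketch shows how such a change-impact check might be captured in code; the ChangeRequest fields and the questions asked are hypothetical examples, not a prescribed framework.

```python
# Minimal sketch of a change-impact check run whenever a business or IT change
# is proposed. The ChangeRequest fields and follow-up actions are illustrative
# assumptions, not a specific methodology.
from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    description: str
    new_data_elements: list = field(default_factory=list)  # data the change introduces
    exposure_changes: bool = False                          # e.g. an internal portal going external
    consent_obtained: bool = True                           # user consent covers the new use

def impact_review(change: ChangeRequest) -> list:
    """Return the follow-up actions a change should trigger before it goes live."""
    actions = []
    if change.new_data_elements:
        actions.append("Classify and protect new data elements: " + ", ".join(change.new_data_elements))
    if change.exposure_changes:
        actions.append("Re-run privacy and threat assessment for the new exposure")
    if not change.consent_obtained:
        actions.append("Obtain user consent before processing")
    return actions or ["No additional controls identified"]

portal_change = ChangeRequest(
    description="Internal HR portal opened to external vendors",
    new_data_elements=["vendor contact details"],
    exposure_changes=True,
    consent_obtained=False,
)
for action in impact_review(portal_change):
    print("-", action)
```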
Is ignoring log files still a very common security weakness?
Log correlation remains very important, but organisations need to have a global perspective and look at the entire horizon to really ascertain their security stature. Internal fraud detection capabilities have really helped. However, fraud detection now accounts for a shrinking share of breach discovery compared with discovery by law enforcement and by unrelated parties; in other words, breaches surfaced by third parties are rising much faster. This goes to show that it is not only important to look at one's own logs, but also to take intelligence input from industry groups and share data with CERT, so as to stay aware of global risk intelligence.
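By way of illustration, here is a minimal Python sketch of matching one's own logs against an externally shared indicator feed; the file name indicators.csv, the feed format and the log field names are hypothetical assumptions.

```python
# Minimal sketch: check firewall log records against a CSV indicator feed
# exported from an industry sharing group or a CERT advisory. File name and
# field names ("ip", "src_ip", "dst_ip") are illustrative.
import csv

def load_indicators(path):
    """Load known-bad IP addresses from a shared threat-intelligence feed."""
    with open(path, newline="") as fh:
        return {row["ip"].strip() for row in csv.DictReader(fh)}

def match_logs_against_feed(log_records, bad_ips):
    """Return log records whose source or destination IP appears in the feed."""
    return [
        rec for rec in log_records
        if rec.get("src_ip") in bad_ips or rec.get("dst_ip") in bad_ips
    ]

if __name__ == "__main__":
    indicators = load_indicators("indicators.csv")  # external intelligence input
    logs = [
        {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.17", "action": "allow"},
        {"src_ip": "10.0.0.9", "dst_ip": "198.51.100.2", "action": "allow"},
    ]
    for hit in match_logs_against_feed(logs, indicators):
        print("Possible contact with known-bad host:", hit)
```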
Despite security systems being in place, is proper monitoring a big issue?
It continues to be a challenge, as it needs continuous focus. In some of the big investigations we did, systems were forwarding logs to a log correlation system, yet even in that system the IP address of the forwarding system was wrong. People should not have a false sense of security just because they have implemented a log correlation engine. What matters is how strong the integration is and how well network flow is being read. It is not only about end-point logs but also about network flow information, which can tell a lot about APTs. These kinds of threats do not generate much noise, so unless you look into what is happening inside and outside the network, they go unnoticed.
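To make the network-flow point concrete, here is a minimal Python sketch that flags sustained outbound volume from flow records; the internal address prefix, the threshold and the record fields are illustrative assumptions, not recommended values.

```python
# Minimal sketch: sum outbound bytes per (internal host, external destination)
# pair from flow records (e.g. exported by a NetFlow/IPFIX collector) and flag
# pairs whose total crosses a threshold. Low-noise threats rarely trip
# end-point alerts, but sustained outbound volume still shows up in flows.
from collections import defaultdict

INTERNAL_PREFIX = "10."                      # assumption: internal address space
EXFIL_THRESHOLD_BYTES = 500 * 1024 * 1024    # assumption: 500 MB per pair

def flag_quiet_exfiltration(flows):
    """Return (host, destination) pairs whose outbound byte total exceeds the threshold."""
    totals = defaultdict(int)
    for f in flows:
        if f["src"].startswith(INTERNAL_PREFIX) and not f["dst"].startswith(INTERNAL_PREFIX):
            totals[(f["src"], f["dst"])] += f["bytes"]
    return {pair: b for pair, b in totals.items() if b > EXFIL_THRESHOLD_BYTES}

flows = [
    {"src": "10.0.1.20", "dst": "203.0.113.50", "bytes": 400 * 1024 * 1024},
    {"src": "10.0.1.20", "dst": "203.0.113.50", "bytes": 200 * 1024 * 1024},
    {"src": "10.0.2.15", "dst": "10.0.3.9",     "bytes": 900 * 1024 * 1024},  # internal, ignored
]
print(flag_quiet_exfiltration(flows))
```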
Where should organisations start with the integration of solutions and processes?
Log correlation and SIEM solutions are very important, but it depends on how the solution has been integrated and configured. It all comes down to the value organisations derive from the solution, in terms of the signatures and rules that have been implemented. Organisations need to see whether they are really integrating it with the entire landscape, including the intrusion prevention system, so that if a particular threat is detected, immediate preventive action can be taken. They also need to check whether it is integrated with the anti-virus system and with network flow sensors.
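As a simplified illustration of this kind of cross-source correlation (not any particular SIEM product's rule syntax), a minimal Python sketch might look like this; the sensor names and the ten-minute window are assumptions.

```python
# Minimal sketch: flag hosts reported by two or more different sensor types
# (IDS, anti-virus, flow sensors) within a time window. Such hosts would be
# candidates for automatic preventive action, e.g. pushing a block rule to
# the intrusion prevention system. Alerts are assumed to be pre-normalised
# into (timestamp, source, host) tuples.
from collections import defaultdict

WINDOW_SECONDS = 600  # correlate alerts for the same host within 10 minutes

def correlate(alerts):
    """Return (host, sensor types) pairs seen from >= 2 sources within the window."""
    by_host = defaultdict(list)
    for ts, source, host in alerts:
        by_host[host].append((ts, source))
    escalations = []
    for host, events in by_host.items():
        events.sort()
        for i, (ts, _) in enumerate(events):
            sources = {src for t, src in events[i:] if t - ts <= WINDOW_SECONDS}
            if len(sources) >= 2:
                escalations.append((host, sorted(sources)))
                break
    return escalations

alerts = [
    (1000, "ids",       "10.0.1.20"),
    (1200, "antivirus", "10.0.1.20"),
    (1300, "netflow",   "10.0.4.7"),
]
print(correlate(alerts))  # -> [('10.0.1.20', ['antivirus', 'ids'])]
```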
How can organisations make big data more consumable for risk management officers?
Organisations need to be very clear about data identification and classification before they even start their journey with big data. They need to be very careful about what that data is, where it is coming from, where it is being sent and how it is being protected in all three states: data at rest, data in transit and data under processing.
Data classification is very important because organisations need to be sure about what kind of data elements are being gathered. It could be cardholder data, where PCI compliance comes in; it could be health records, so HIPAA comes in; it could be personally identifiable information, where many state laws or EU directives come into play; or it could be personal information, which the 2011 amendments to the IT Act address. They also have to look at how the data is securely destroyed. Thus, organisations need to be aware of the complete data lifecycle and put a clear security matrix in place when they handle big data.
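As an illustration of how such classification could be encoded, here is a minimal Python sketch; the field names, categories and regime mappings are simplified assumptions, not legal guidance.

```python
# Minimal sketch: tag incoming data elements with the compliance regimes
# mentioned above, so protection requirements can be tracked across the
# lifecycle (collection, storage, transit, processing, destruction).
CLASSIFICATION_RULES = {
    "card_number":   {"category": "cardholder data",      "regimes": ["PCI DSS"]},
    "health_record": {"category": "health information",   "regimes": ["HIPAA"]},
    "email_address": {"category": "PII",                  "regimes": ["EU data protection rules", "state privacy laws"]},
    "national_id":   {"category": "personal information", "regimes": ["IT Act (as amended)"]},
}

def classify(field_names):
    """Return the category and applicable regimes for each known field;
    unknown fields are flagged for manual review rather than silently ignored."""
    return {
        name: CLASSIFICATION_RULES.get(name, {"category": "unclassified", "regimes": ["needs review"]})
        for name in field_names
    }

incoming_feed = ["card_number", "email_address", "purchase_amount"]
for field, info in classify(incoming_feed).items():
    print(f"{field}: {info['category']} -> {', '.join(info['regimes'])}")
```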
Is it easy to breach air-gapped systems?
With the increase in automation and in human-machine interfaces, these systems are becoming more connected to TCP/IP networks, and the more they get connected, the more vulnerable they become. These systems need to be assessed: risk assessment is the only way to profile the risk of this kind of infrastructure and these deployments. Organisations need to be clear in terms of network architecture, connectivity, third-party software and third-party management.
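As a rough illustration, a minimal Python sketch of such a risk-profiling pass might score assets on the factors mentioned above; the factors, weights and asset names are illustrative assumptions only.

```python
# Minimal sketch: score formerly air-gapped assets on connectivity and
# third-party exposure, then rank them for further assessment. Weights are
# arbitrary placeholders for whatever an organisation's risk model defines.
FACTOR_WEIGHTS = {
    "tcp_ip_connected": 3,      # reachable from the corporate TCP/IP network
    "internet_exposed": 5,      # reachable from outside the organisation
    "third_party_software": 2,  # runs vendor-supplied software
    "third_party_managed": 2,   # administered by an external party
}

def risk_score(asset):
    """Sum the weights of all factors that apply to the asset."""
    return sum(w for factor, w in FACTOR_WEIGHTS.items() if asset.get(factor))

assets = [
    {"name": "plant-hmi-01", "tcp_ip_connected": True, "third_party_managed": True},
    {"name": "scada-hist-02", "tcp_ip_connected": True, "internet_exposed": True,
     "third_party_software": True},
]
for a in sorted(assets, key=risk_score, reverse=True):
    print(a["name"], risk_score(a))
```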