Are you complying with your policies?
Don’t you hate it when you’re chugging along, minding your own business, doing what you believe to be the right things in business, then whammo, an oversight catches you off guard? Take, for instance, a compliance violation that comes up during a routine audit. The auditor discovers the gap, your team is called on it, and you have to make it right, then vow to never let it happen again.
This very thing can happen to you and your business if you’re not careful when it comes to compliance. The more websites and applications I test, the more I realize that so many people are completely missing the spirit of the regulations they claim to be compliant with.
I don’t believe it’s intentional, but the more complex your information systems environment becomes, the greater the risk of overlooking web security issues that can get you into a real bind with the auditors and regulators.
Let’s look at this in the context of the Health Insurance Portability and Accountability Act (HIPAA), but know that these same problems can occur with other regulations such as PCI DSS, GLBA, or even the various breach notification laws. The HIPAA Security Rule mandates controls around the following general areas, among others:
- Risk analysis
- Access controls
- Audit logging
- System monitoring
- Encryption
- Policy enforcement
- Incident response
These items alone are a lot to take on. Yet, based on what I see and hear about how web security is being handled, many organizations are really missing the mark on web-related compliance.
No true risk analysis has been performed, especially on seemingly harmless web applications used for content management, customer service, or business workflows. The thing is, however, if sensitive information is accessible directly or indirectly through these systems, you’d better be testing them for security flaws and ensuring the proper controls are in place to meet the applicable regulations.
User accounts and privileges are often exploitable because of things like poor session management and weak password requirements.
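To make that gap concrete, here’s a minimal Python sketch of the kind of baseline password-complexity check that’s often missing. The length and character-class thresholds are illustrative assumptions for this sketch, not values taken from HIPAA or any other regulation.

```python
import re

# Illustrative policy thresholds only; these are assumptions for the sketch,
# not requirements pulled from HIPAA or any other regulation.
MIN_LENGTH = 12
REQUIRED_CLASSES = [
    r"[a-z]",          # at least one lowercase letter
    r"[A-Z]",          # at least one uppercase letter
    r"[0-9]",          # at least one digit
    r"[^A-Za-z0-9]",   # at least one symbol
]

def password_meets_policy(password: str) -> bool:
    """Return True only if the password satisfies the baseline complexity policy."""
    if len(password) < MIN_LENGTH:
        return False
    return all(re.search(pattern, password) for pattern in REQUIRED_CLASSES)

print(password_meets_policy("correct-Horse7battery"))  # True
print(password_meets_policy("password"))               # False
```

If your applications accept passwords that fail a check this basic, it’s hard to argue the access-control requirement is being met in spirit.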
Personally identifiable information is strewn about web and database servers, often accessible via directory traversal or improper file permissions.
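For illustration, here’s a hedged Python sketch of the kind of path check that stops directory traversal when serving user-requested files. The WEB_ROOT directory is hypothetical, and the check assumes Python 3.9 or later.

```python
from pathlib import Path

# Hypothetical directory the application is allowed to serve files from.
WEB_ROOT = Path("/var/www/app/files").resolve()

def read_user_file(requested_name: str) -> bytes:
    """Resolve a user-supplied file name and refuse anything outside WEB_ROOT."""
    candidate = (WEB_ROOT / requested_name).resolve()
    # A request like "../../etc/passwd" resolves outside the allowed directory
    # and is rejected here. Path.is_relative_to() requires Python 3.9+.
    if not candidate.is_relative_to(WEB_ROOT):
        raise PermissionError(f"Refusing path outside web root: {requested_name}")
    return candidate.read_bytes()
```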
SSL/TLS is often not enforced across the entire application, and sensitive information is stored unencrypted in the database or on file shares.
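As one example of enforcing encryption in transit, here’s a rough sketch assuming a Flask application; the framework choice is an assumption on my part, and the same redirect-plus-HSTS policy is often enforced at the web server or load balancer instead.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Send any plain-HTTP request to its HTTPS equivalent.
    # Behind a reverse proxy, the scheme may need to come from X-Forwarded-Proto.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts(response):
    # Ask browsers to use HTTPS for all future requests to this site.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response
```

Encryption at rest is a separate control, but the point is the same: if it isn’t enforced everywhere sensitive data flows, it isn’t really enforced.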
Security policies say one thing, yet they’re either ignored by users or unknown to them.
Incident response is treated as “We’ll figure out what we need to do if the website is ever hacked.”
All of these are common security gaps, yet auditors who are oblivious to the inner workings of the web environment will tell management time and again that everything is “compliant”. Compliant with what? Once management hears this, they’ll claim they have taken the necessary steps to prevent a web security breach. I don’t think so.