Ten Aspects of Security to Improve Application Strength

  • March 6, 2006
  • By Chad Cook

Encryption

As with most security technologies, encryption is easy to render ineffective when used improperly. It is an important component in most security architectures, but there are some complexities involved in its use. The following need to be considered when using encryption:

  1. The type of algorithm: symmetric or asymmetric
  2. The generation of keys and keying material
  3. The management of keys and keying material
  4. The application's performance requirements

There are two types of encryption algorithms: asymmetric and symmetric. Asymmetric algorithms have separate keys for encryption and decryption, use longer key lengths, have slower performance, and have easier key distribution with somewhat difficult key management. Symmetric algorithms use a single key for encryption and decryption, typically have smaller key lengths, higher performance, and more complex key distribution with simpler key management. Choosing a type and algorithm is often done automatically depending on the technologies used. Unless implementing a new or proprietary protocol, the use of encryption is generally governed by an existing standard.
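
To make the trade-off concrete, the following sketch contrasts the two types in Python using the third-party cryptography package (an assumption on my part; any comparable library works, and a recent version of the package is assumed):

    # Symmetric: one shared key encrypts and decrypts; fast, smaller keys.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    shared_key = Fernet.generate_key()
    f = Fernet(shared_key)
    token = f.encrypt(b"bulk application data")
    assert f.decrypt(token) == b"bulk application data"

    # Asymmetric: separate public/private keys; slower, larger keys,
    # but the public key can be distributed freely.
    private_key = rsa.generate_private_key(public_exponent=65537,
                                           key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ciphertext = private_key.public_key().encrypt(b"short secret", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"short secret"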

When using encryption, several pieces of information affect its strength, including how the keys are generated and the sources of data used to start key generation. Initialization vectors used to seed keying material or encryption operations rely on sources of randomness and entropy. If those sources are weak or predictable, the resulting keys can be weak enough for attackers to compromise the encrypted data. The most reliable sources of randomness are often hardware-based, and many systems also have built-in pseudo-random number generators (PRNGs).
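
As an illustration, assuming Python 3.9 or later, the standard library already separates cryptographically strong sources from seedable PRNGs that must never be used for keys:

    import os
    import random
    import secrets

    # Cryptographically strong sources, backed by the operating system's
    # entropy pool; suitable for keys and initialization vectors.
    iv = os.urandom(16)            # e.g., an AES initialization vector
    key = secrets.token_bytes(32)  # 256 bits of key material

    # NOT suitable for keys: a seedable, deterministic PRNG whose entire
    # output stream is reproducible once its seed or state is known.
    weak = random.Random(1234).randbytes(32)  # attacker can reproduce this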

Once keys are generated, the next challenge is their management, which includes storage, distribution, revocation, and updates. The problem of key storage is fairly obvious, given the plethora of smart cards, USB tokens, and dongles readily available. Because the key used for decryption must be stored safely, encryption can become useless if private keys are kept on globally accessible media, such as a hard drive. The fun does not stop there, however. For symmetric encryption, where a single key is used for both encryption and decryption, that key must somehow be communicated to the parties that want to encrypt data. It is obviously not ideal to simply transmit the key over a common channel (e-mail, a file, instant messaging, and so forth), because it can be compromised easily. For this reason, several schemes are used to exchange keys, including Diffie-Hellman key exchange, the use of public key cryptography to exchange keys, and other algorithms that allow for mutual calculation of keys. Finally, the management of keys can become tricky, especially for keys that are compromised or expire while the data they protect persists for long periods of time. If keys are lost, stolen, or expired, new keys must be generated, and any data encrypted with the old keys must be decrypted and re-encrypted using the new ones.
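
As a hedged sketch of mutual key calculation, the following uses X25519, a modern Diffie-Hellman variant, again via the third-party cryptography package:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey,
    )
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice_private = X25519PrivateKey.generate()
    bob_private = X25519PrivateKey.generate()

    # Only public keys cross the untrusted channel; each side combines its
    # private key with the peer's public key to reach the same secret.
    alice_shared = alice_private.exchange(bob_private.public_key())
    bob_shared = bob_private.exchange(alice_private.public_key())
    assert alice_shared == bob_shared

    # Derive the actual session key from the raw shared secret.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"session key").derive(alice_shared)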

Finally, the performance requirements of the application can affect which encryption technologies are used. Public key algorithms are computationally expensive and will result in dramatic decreases in performance and throughput. Symmetric ciphers perform far better. A balance of the two can often be struck: SSL/TLS is a perfect example. SSL/TLS most commonly runs in a mode that utilizes both public key and shared key encryption algorithms: the public key algorithm is used to exchange the shared key safely, and that shared key then handles all of the bulk encryption operations for the remainder of the session. It is also important to note that different algorithms perform differently.
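
The hybrid pattern itself can be sketched directly. This is an illustration under the same cryptography-package assumption, not the actual TLS handshake: RSA-OAEP wraps a one-time AES key, and AES-GCM does the bulk work.

    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # The expensive public-key operation happens once, on a small payload.
    session_key = AESGCM.generate_key(bit_length=256)
    wrapped_key = recipient.public_key().encrypt(session_key, oaep)

    # The fast symmetric cipher handles the bulk data.
    nonce = os.urandom(12)
    bulk = AESGCM(session_key).encrypt(nonce, b"large message body", None)

    # Receiver unwraps the session key once, then decrypts cheaply.
    unwrapped = recipient.decrypt(wrapped_key, oaep)
    assert AESGCM(unwrapped).decrypt(nonce, bulk, None) == b"large message body"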

If performance is a factor, careful selection of encryption types and algorithms allows the application to meet its requirements. Alternatively, hardware components are available to offload encryption operations and increase their performance. These components come in many form factors and can be utilized by the application software.

Segregation of Data and Privileges

In conjunction with access control (above), how data is organized within the application can add to or detract from its security. This flaw manifests itself when the application has no central mechanism for accessing sensitive data and the management of that data is distributed across many components and modules. Such distribution makes access control very difficult and creates multiple points of vulnerability: each module or component that manages sensitive data becomes a potential target.

Applications that work with sensitive information can improve security by segregating access and management of that data to a protected module. This module can implement granular access controls and auditing functionality to assure access is granted only to those components that need it. Centralizing access to sensitive data also allows the application designer to more cleanly and logically organize functionality that may otherwise end up unmanageable when it is distributed across too many components.
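
A minimal sketch of such a protected module follows; SensitiveStore and its field names are hypothetical, and a real implementation would also authenticate its callers:

    import logging

    audit = logging.getLogger("audit")

    class SensitiveStore:
        """Single point of access for sensitive records."""

        def __init__(self, acl: dict[str, set[str]]):
            self._records: dict[str, str] = {}
            self._acl = acl  # maps field name -> components allowed to read

        def put(self, field: str, value: str) -> None:
            self._records[field] = value

        def get(self, field: str, component: str) -> str:
            allowed = component in self._acl.get(field, set())
            audit.info("component=%s field=%s granted=%s",
                       component, field, allowed)  # every access is audited
            if not allowed:
                raise PermissionError(f"{component} may not read {field}")
            return self._records[field]

    store = SensitiveStore({"card_number": {"billing"}})
    store.put("card_number", "4111-0000-0000-0000")
    store.get("card_number", "billing")    # granted and logged
    # store.get("card_number", "reports")  # would raise PermissionError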

The second aspect to isolation and segregation within applications concerns privilege. Most systems have some notion of privilege levels, especially when doing development on Windows and UNIX platforms—and many operations require elevated privileges to execute. Common mistakes in this area are:

  1. Running the application in an elevated privilege level at all times (as the "root" user on UNIX or as one of the administrative accounts on Windows).
  2. Failure to determine which actions truly require elevated privileges. Most operations within an application can be done without the need for elevated privileges.

One solution often used today is the separation of privileged actions. There are a few ways to accomplish this:

  1. If possible, isolate privileged operations to initialization: start with elevated privileges, perform the privileged work, and then drop privileges once initialization is complete (see the sketch after this list).
  2. If elevated privileges are required at various points during the normal operation of an application, the privileged operations should be isolated from the non-privileged operations. Separate threads or processes can then be used to isolate and execute those operations. Combined with the appropriate authentication and access control (above), privileged operations can be implemented and managed safely.
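
A minimal sketch of the first approach on a UNIX platform, assuming the process starts as root and drops to the conventional unprivileged "nobody" account:

    import os
    import pwd
    import socket

    def drop_privileges(username: str = "nobody") -> None:
        user = pwd.getpwnam(username)
        os.setgroups([])        # shed supplementary groups first
        os.setgid(user.pw_gid)  # set group before user, or setuid blocks it
        os.setuid(user.pw_uid)

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", 443))  # the privileged operation: a low port
    server.listen(5)

    drop_privileges()              # everything after this runs unprivileged
    assert os.getuid() != 0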

In short, it is vital to limit privileged functionality to a minimum and to keep sensitive data managed. Designing applications with this in mind can reduce the potential vulnerability and allow for better auditing within the application to determine when a problem has occurred.

Error Handling

Error handling is both an engineering and security challenge; how an application behaves individually and in conjunction with other applications can result in stronger or weaker security. An application that has defined constraints and safely handles error conditions can mean the difference between a resilient environment and one that crashes completely.

Error handling is an aspect of application design that is often forgotten or ignored. If a framework for handling errors is not designed, an application may end up with a different scheme for each developer who works on the project. Components of error handling can include the following (a minimal framework is sketched after the list):

  1. Definition of error types: processing, runtime, security violations, program bugs/assertions.
  2. Definition of severity levels: informational, warnings, catastrophic issues.
  3. Logging and auditing: centralized auditing with modular logging components such as file, e-mail, and cell phone.
  4. Defined responses to error types and levels: issue a warning, stop a component, restart a service, shut down the application, notify an administrator.
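
A hypothetical Python skeleton tying these four components together might look like the following; the type names and responses are illustrative only:

    import logging
    from enum import Enum, auto

    class ErrorType(Enum):
        PROCESSING = auto()
        RUNTIME = auto()
        SECURITY_VIOLATION = auto()
        ASSERTION = auto()

    class Severity(Enum):
        INFO = auto()
        WARNING = auto()
        CATASTROPHIC = auto()

    # Centralized auditing: file, e-mail, or pager handlers attach here.
    audit = logging.getLogger("errors")

    def notify_administrator(message: str) -> None:
        ...  # hook for e-mail, paging, and so forth

    def report(kind: ErrorType, severity: Severity, message: str) -> None:
        audit.warning("type=%s severity=%s msg=%s",
                      kind.name, severity.name, message)
        if kind is ErrorType.SECURITY_VIOLATION:
            notify_administrator(message)      # defined response per type
        if severity is Severity.CATASTROPHIC:
            raise SystemExit("shutting down: " + message)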

The design of an application's error handling capabilities is further strengthened when the functional requirements include negative scenarios; see the section on thinking like the enemy, above. Providing a framework for error handling creates more predictable and easily used applications. It resolves any difficulties that arise from individual developer styles and forces the designer to outline the constraints and limits within the application. A framework for error handling also provides greater clarity and understanding when anomalies do occur.

Testing for Security

One final aspect that people often fail to recognize is the importance of testing as it relates to security. Three components of testing for security are:

  1. Functional verification
  2. Constraint evaluation
  3. Border cases and attacks

Functional verification is the most prevalent form of testing and is most often what is meant when people refer to testing applications. Just as one creates tests for functional requirements, it is equally important to devise tests for the requirements established for undesired behavior and error conditions—the constraints and limits of the application. Each of the conditions and responses defined for the behavior of the application should be wrung out and verified.
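
For example, using pytest and a hypothetical parse_port function, a constraint test asserts that invalid input is rejected rather than only checking the happy path:

    import pytest

    def parse_port(text: str) -> int:
        value = int(text)  # non-numeric input raises ValueError here
        if not 1 <= value <= 65535:
            raise ValueError(f"port out of range: {value}")
        return value

    def test_valid_port():
        assert parse_port("8080") == 8080

    @pytest.mark.parametrize("bad", ["0", "65536", "-1", "http", ""])
    def test_constraints_are_enforced(bad):
        with pytest.raises(ValueError):
            parse_port(bad)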

Border tests can be thought of in the following manner:

  1. Testing interfaces to the application: APIs, network, and user—the physical borders.
  2. Testing the borders of data processing: multiple combinations of imperfect data that the application will process. These can include values at the edges of valid ranges, unexpected types, and randomly generated data (a sketch follows this list).
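
A border-case sketch, reusing the hypothetical parse_port from the previous example: feed it boundary values and randomly generated data, and require that it either succeeds within its contract or fails with the controlled error, never an unexpected crash:

    import random
    import string

    def fuzz_parse_port(trials: int = 1000) -> None:
        borders = ["0", "1", "65535", "65536", str(2**31), "", " "]
        randoms = ["".join(random.choices(string.printable,
                                          k=random.randint(1, 20)))
                   for _ in range(trials)]
        for case in borders + randoms:
            try:
                port = parse_port(case)    # hypothetical parser under test
                assert 1 <= port <= 65535  # any success must be in range
            except ValueError:
                pass                       # rejected cleanly: acceptable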

There are a number of tools available to test an application's security. These include code analyzers, modeling tools, and stress testing tools such as fuzzers, which help identify many software vulnerabilities and coding issues.

The combination of testing that covers defined functionality and constraint evaluation, along with stress testing that pushes the limits and boundaries of an application, helps improve security before the application is deployed in critical environments.

Summary

Security in and of an application does not have to be an overwhelming task. By considering the security aspects of an application at all stages, as early as basic functional requirements, one can weave security into all areas of the application; doing so results in a cumulative, strong level of security, resiliency, and quality. Effort spent during the design phases examining the various layers of the application, and how one hopes people will and will not use it, sets the foundation for functionality that both meets the needs of users and withstands the anomalies that occur.

Proper application of security technologies such as authentication, access control, and encryption can assure a higher degree of safety in untrustworthy environments. An understanding of worst-case scenarios, handling errors, and compartmentalizing sensitive data and operations within the application can solidify its strength. Finally, taking some time to test and verify constraints, functionality, and security will assure a level of confidence for developers and users alike. Keep security in mind to create useful—and safe—applications.

About the Author

Chad Cook has spent over a dozen years in information security, spanning both product engineering and IT services. Chad has developed IT service security strategies, networks, and policies for organizations including BBN and GTE Internetworking, Infolibria, and the international security consulting firm @stake/Symantec. He has architected and developed security technology for award-winning networking, security, and financial products sold worldwide, including core routers, edge devices, utility hosting systems, Web services security devices, and high-performance systems. Chad has nine patents applied for and pending on security analysis, modeling techniques, and encryption-processing acceleration.

Currently, Chad is the VP of Information Security at Lime Group, a New York securities and brokerage organization, where he leads product architecture, infrastructure security, and compliance efforts. Prior to Lime Group, he designed and developed security risk management and threat modeling products as CTO at Black Dragon Software. Chad has held lead engineering and security positions developing products at BBN, GTE, and a number of small companies. Chad is an internationally published author on security topics: he contributed to two books, Maximum Security, 3rd and 4th editions, has been featured in numerous articles, and has written for Symantec's SecurityFocus.com and numerous online publications.

A frequent speaker, Chad has spoken at NATO and United Nations forums on security, numerous conferences, analyst events, and security summits.




