Some interesting webinars on modern security practices, including tokenization and federated SSO with strong-factor AuthN. Tokenization is the process of replacing sensitive data in systems with inert data that has no value to an information thief. This can be credit card numbers, but really any data: user IDs, SSNs, PHI, anything. The token resembles the original data in type and length. Tokenization, especially internal tokenization, can help organizations that need to store many different records in different systems, where the key infrastructure required for encryption would put too great a burden on the security framework.
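A minimal sketch of the idea, assuming a simple in-memory vault (a real deployment would use a hardened, access-controlled token vault): the token keeps the original value's length and character type, so downstream systems keep working, but the token itself is inert.

```python
import secrets

# Hypothetical in-memory vault mapping tokens back to original values.
# Illustrative only; a real token vault is a hardened, audited service.
_vault = {}

def tokenize_pan(pan: str) -> str:
    """Replace a card number with an inert token of the same length and type.

    Preserves the last four digits (a common convention) so systems that
    only display those digits continue to work against the token.
    """
    random_part = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    token = random_part + pan[-4:]
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Look the original value back up; only the vault can do this."""
    return _vault[token]
```

The point is that the token is not mathematically derived from the card number, so stealing tokens (or the systems that store them) yields nothing without the vault.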
Internal tokenization is easier to implement, solves urgent problems, and involves only the host organization. The other option, external tokenization, involves third parties and solves bigger problems. Merchants will be more interested in external tokenization, since it reduces PCI scope by the greatest amount, whereas an insurance company that needs to maintain PHI will probably want an internal tokenization solution. Internal tokenization should also avoid vendor lock-in.
There are single-use and multi-use tokens. VISA has a [spec](http://usa.visa.com/download/merchants/tokenization_best_practices.pdf) for single-use tokens, which are based on randomness (code-books), as opposed to encryption methods (a cipher with an algorithm and key). Single-use relaxes constraints on how the transaction is stored, whereas multi-use requires persistence of the token relationship. One should think about how the token is generated: mathematically connected to the data, or random, which implies a tie-in to a lookup table. TCO appears better for token-based approaches than for pure encryption, since tokens reduce PCI scope requirements. Another concern is distributed vs. centralized token infrastructure. With a distributed organization, you need to worry about synchronization, which brings a latency constraint and a 24×7 availability constraint, and can lead to token collisions under the random-token approach. However, distribution can scale to thousands of token servers. There is a tension between using encryption for distribution and tokenization within the data centre.
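To make the multi-use/random distinction concrete, here is a sketch of a multi-use token table (names are illustrative, not from the VISA spec): the same input always maps back to the same random token, the token relationship is persisted in the table, and collisions between randomly generated tokens are detected by retrying.

```python
import secrets

class TokenTable:
    """Sketch of a multi-use, random (code-book style) token table.

    The token is not derived from the value, so mapping back requires
    the table itself; that is the tie-in to a lookup table mentioned above.
    """
    def __init__(self):
        self._by_value = {}   # original value -> token
        self._by_token = {}   # token -> original value

    def get_token(self, value: str) -> str:
        # Multi-use: return the persisted token if one already exists.
        if value in self._by_value:
            return self._by_value[value]
        # Random generation with a collision check; in a distributed
        # deployment this check is what forces synchronization.
        while True:
            token = secrets.token_hex(8)
            if token not in self._by_token:
                break
        self._by_value[value] = token
        self._by_token[token] = value
        return token
```

A single-use scheme would skip the `_by_value` lookup and hand out a fresh token per transaction, which is why it relaxes storage constraints.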
The PCI Council has created two taskforces, one for tokenization and one for scoping, which are meant to report guidelines for each domain; these are expected next year. One point made was that the value of this approach is only fully realised when there is a breach. This is insurance, and the costs of not being insured can be very high: Sony, after its recent breach of PII, is estimating costs of $1.5B. Hope is *not* a risk mitigation strategy.
Federated SSO and two-factor auth are an important development in cloud-based infrastructure. Identity silos are the result of directories growing in scope to include assertions, with directory structures too brittle for the cloud. Tight identity coupling strains IT's ability to scale when it comes to security, privacy, credential copies and business agility. The use of multiple directories or silos introduces latency for synchronization across silos and the need to push credentials to multiple locations, which is a problem for audit compliance.
To overcome this, there was a need for an architecture that separates the source of identity information from its use. A digital subject wields a credential as an identity entity; there is an identity-attesting entity, say an IdP in SAML, and an identity-relying entity, an SP in SAML. Identity claims pass among these nodes. This leads towards the goal of identity statelessness: outsourcing AuthN and attributes to authoritative sources. This in turn leads to just-in-time (JIT) IdM, caching rather than replication, which isolates audit to the authoritative source.
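The "caching rather than replication" idea can be sketched as follows (a minimal illustration, assuming a `fetch_fn` callable that stands in for a call to the authoritative IdP/attribute source): attributes are fetched just-in-time and held only for a short TTL, instead of being synchronized into a local directory copy.

```python
import time

class JITIdentityCache:
    """Sketch of JIT IdM: cache identity attributes briefly instead of
    replicating them into a local silo. Audit stays with the authority,
    since every cache miss is a visible call to the authoritative source.
    """
    def __init__(self, fetch_fn, ttl_seconds=300):
        self._fetch = fetch_fn          # call to the authoritative source
        self._ttl = ttl_seconds
        self._cache = {}                # subject -> (expires_at, attributes)

    def attributes_for(self, subject: str) -> dict:
        entry = self._cache.get(subject)
        if entry and entry[0] > time.time():
            return entry[1]                          # still fresh
        attrs = self._fetch(subject)                 # just-in-time fetch
        self._cache[subject] = (time.time() + self._ttl, attrs)
        return attrs
```

Contrast with replication: there is no second copy of the directory to keep in sync, so the latency and audit-compliance problems of pushing credentials to multiple locations do not arise.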
SAML makes the security context portable. SAML has operational modes, profiles, protocols, assertions and bindings as layers of its framework. SAML assertions are the way to pass identity claims. The elements are saml:Issuer, ds:Signature, saml:Subject, saml:Conditions, saml:Advice and the saml statements. The statements capture what type of AuthN was performed, which allows logic to determine whether policy was met. SAML lets enterprises apply strong auth as needed by policy, even in cloud mashup and mobile contexts. Classic federated identity differs from cloud-centric federated identity, since in the cloud both the IdP and the SP can use and authenticate claims.
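An illustrative skeleton of where those elements sit in an assertion (hostnames, timestamps and the context class are placeholder values, not from any real deployment):

```xml
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
                ID="_example" Version="2.0" IssueInstant="2011-06-01T12:00:00Z">
  <saml:Issuer>https://idp.example.com</saml:Issuer>
  <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#"><!-- ... --></ds:Signature>
  <saml:Subject>
    <saml:NameID>user@example.com</saml:NameID>
  </saml:Subject>
  <saml:Conditions NotBefore="2011-06-01T12:00:00Z"
                   NotOnOrAfter="2011-06-01T12:05:00Z"/>
  <saml:Advice><!-- optional hints for the relying party --></saml:Advice>
  <saml:AuthnStatement AuthnInstant="2011-06-01T12:00:00Z">
    <saml:AuthnContext>
      <!-- Records what type of AuthN was performed, so the SP can
           decide whether policy (e.g. a strong factor) was met. -->
      <saml:AuthnContextClassRef>
        urn:oasis:names:tc:SAML:2.0:ac:classes:TimeSyncToken
      </saml:AuthnContextClassRef>
    </saml:AuthnContext>
  </saml:AuthnStatement>
</saml:Assertion>
```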
Adaptive strong AuthN leads to pattern recognition everywhere. The recent RSA breach marks the end of reliance on hardware tokens in any sensible organization. As a way to overcome some design problems, there is an opportunity to fuse consumer-grade, two-factor authentication and SSO. Risk-based AuthN with software tokens removes the cost and complexity of hardware tokens, provides centralized access control, and provisions for mobile consumer equipment. A way to think of the various options is ease of use vs. cost of ownership/security. Adaptive AuthN allows one to move along the cost dimension as context suggests, triggering additional AuthN actions. For comprehensive identification of users, a layered approach is needed: identity resolution starts with identity proofing and vetting, leading to adaptive and risk-based AuthN, leading to enterprise fraud management.
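A minimal sketch of the risk-based idea (the signals, weights and thresholds below are illustrative assumptions, not from any product): a risk score is computed from context, and the required authentication step-up moves along the cost/assurance dimension with the score.

```python
# Illustrative adaptive AuthN: higher contextual risk demands a
# stronger (more costly) authentication action. All numbers are
# hypothetical policy choices.

def risk_score(known_device: bool, usual_geo: bool, usual_hours: bool) -> int:
    score = 0
    if not known_device:
        score += 40     # unrecognized device
    if not usual_geo:
        score += 40     # unusual location
    if not usual_hours:
        score += 20     # unusual time of access
    return score

def required_factor(score: int) -> str:
    if score < 30:
        return "password"             # low risk: single factor suffices
    if score < 70:
        return "password+sw-token"    # medium risk: step up to a sw token
    return "deny-or-manual-review"    # high risk: escalate
```

This is the "move along the cost dimension" point in code: most sessions stay cheap, and the expensive actions are reserved for anomalous context.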