Secure Managed File Transfer: On-Premise vs The Cloud

Everybody is talking about the cloud; it's today's hot topic, with more and more organisations considering a cloud-based (hosted) solution as an alternative to their current on-premise solution.  The shift to cloud-based computing is gathering pace, and consequently this is an area we've been looking at quite closely.

So, is cloud-based secure managed file transfer right for you, and what are the biggest drivers behind this trend?

1.  It's cheaper! Many IT departments spend at least 50% of their budgets on salaries, and up to 70% of IT staff time is spent on maintenance, according to analysts. In-house IT specialists represent a significant ongoing management cost. A hosted service, on the other hand, may charge a much-reduced figure for its service along with 24/7/365 monitoring and higher uptime than many companies can achieve with on-premise staff and systems.

2.  Hosted providers can do it better. Hosting vendors store the information on their own servers and manage the entire system for you, drastically reducing the time and energy you spend on keeping your MFT up and running. A growing number of companies just want MFT isolated as an enterprise-class cloud service, with all the modern archiving, compliance and virus protection features they require along with a scalable infrastructure their IT staff never has to worry about or manage.

3. The cloud has gone mainstream. Primed for enormous growth and widespread adoption, recent research indicates that 84 percent of small and mid-size companies and 69 percent of large companies are willing to consider, currently reviewing or already using software-as-a-service (SaaS) solutions. A big part of this growth is a result of the increase in broadband Internet access, but another key factor is that cloud MFT vendors are making better, simpler and more affordable software that doesn't require a technical degree to set up or use. It's also more widely accepted as a safe alternative to on-premise solutions.

4.  Pay as you go. As budgets tighten in this tougher economic period, more and more companies are gravitating toward cloud-based solutions. With no technology to maintain, total cost of ownership is five to ten times less than that of installed software, so it's easier to budget and to scale as you add and subtract users. In addition, cloud-based solutions do not require ongoing maintenance time or complex upgrades, so what was once a capital expense becomes a more balance-sheet-friendly operating expense.

As this shift to cloud-based computing continues to gather pace, Pro2col is at the forefront of assessing the industry's leading vendors to ensure we know which solution is right for your budget and set of requirements.

But the Cloud isn't for Everyone

Despite all this optimism for the cloud, we know there are plenty of situations where it may not make sense to move your MFT there. Some data may need to remain on-premise, behind a firewall for legal or regulatory considerations (e.g., HIPAA). Also, other on-premise applications (e.g., document workflows) may be tightly integrated with your on-premise MFT system, so moving your MFT to the cloud could pose challenges if you are hoping to continue coupling these solutions. Finally, many organisations may not have fully made use of their existing on-premise MFT solutions (i.e., they have already invested in it) and may not be able to easily or practically abandon it.

For independent advice on Cloud/Hosted FTP or On-Premise Managed File Transfer solutions contact Pro2col on +44 (0) 333 123 1240 or +44 (0) 1202 433 415.

Ipswitch MOVEit DMZ Managed File Transfer Review

SC Magazine have reviewed a number of managed file transfer solutions available in the marketplace – Ipswitch MOVEit DMZ being one of them. This managed file transfer server software helps secure data in transit by encrypting various transfer protocols using industry standards.

After reviewing product attributes such as features, ease of use, performance, documentation, support and value for money, Ipswitch MOVEit DMZ was awarded a full 5 stars in every category and labelled one of SC Magazine's 'Best Buys'. Citing no negative points against the solution, the overall verdict deemed MOVEit DMZ "A flexible, web-based product which allows tight control over end-to-end file transfer security."

See here for full details of the SC Magazine review. For more information regarding the Ipswitch File Transfer product range, please feel free to contact Pro2col on 0333 123 1240 to speak to one of our consultants.

How will the changes to PCI DSS affect you?

The PCI Security Standards Council have just released version 2.0 of PCI DSS, the Data Security Standard enforced upon all merchants that accept any form of card payment, designed to secure and protect cardholder details.  Although introducing only minor alterations, the main intention of the amendment is to provide greater clarity and flexibility for small merchants, facilitating a more comprehensive understanding of the requirements that must be satisfied under PCI DSS and making them easier to implement and abide by.

From a long term perspective, the amendments made are designed to help merchants manage evolving risks and data security threats whilst maintaining alignment with industry best practices.  Taking a higher level perspective, the main changes cover:

  • Reinforcement of the need to conduct thorough scoping exercises, so that merchants can identify exactly where their cardholder data resides in the business.
  • The need for more effective log management of credit card data within the business.
  • Allowance for organisations to adopt a more risk based approach when prioritising vulnerabilities, taking into account their specific circumstances.
  • The acceptance of unique business environments and accommodation of their specific needs.

More specifically, Jonathan Lampe, VP of Product Management at Ipswitch File Transfer and a representative on the PCI Security Council, has identified the five key changes that will directly affect the transfer of sensitive credit card data:

  • Explicit recognition of SFTP as a secure protocol.
  • Audit of virtual machine infrastructure and virtualisation hypervisors will be brought within the scope of PCI DSS.
  • Rotation requirements for the purposes of key management will be “based on industry best practices and guidelines” rather than an annual stipulation.
  • Identity and authentication requirements for users, “non-consumers” and administrators will be split further.
  • More specific requirements will be implemented around the auditability and security of timekeeping, especially as recorded in audit logs.  (Coordinated and reliable timestamps are helpful during civil and criminal investigations as well as internal forensics investigations.)

A further step taken by the PCI Council to help small merchants achieve the latest 2.0 PCI DSS changes is the introduction of a small microsite.  The implementation life-cycle of the PCI Council's standards will be extended from the current two years to three years, giving merchants plenty of time to make the necessary changes.  The new 2.0 standard will be effective from 1st January 2011; however, validation against the previous 1.2.1 standard will be allowed until 31st December 2011.

For more information regarding PCI DSS compliance and how this can be achieved in terms of secure file transfer, please don't hesitate to contact the team at Pro2col on 0333 123 1240.

PCI DSS 2.0 Makes for Smarter Data Transfer Security

Tuesday, October 19, 2010 – Ipswitch File Transfer, Inc., an innovator of secure, managed file transfer solutions, today identified five key changes to the Payment Card Industry Data Security Standard (PCI DSS 2.0) standard that will substantially affect businesses transferring sensitive credit card data.  The final draft of the standard will be released on October 28. However, the substance of many changes is now clear, whilst working groups on emerging technologies continue to report on forthcoming inclusions in the standard.

“The impending changes reflect developments in technology, the cost pressures on businesses and the development of smart, accepted practices,” explained Jonathan Lampe, VP of Product Management at Ipswitch and representative on the PCI Community Council. “Around fifty of our customers, from all over the world, are represented on the council.  The emphasis has been on identifying what’s secure and what works best.”

Key changes forthcoming in PCI DSS 2.0 that will impact the transfer of sensitive data include:

  • Explicit recognition of SFTP as a secure protocol.
  • Audit of virtual machine infrastructure and virtualisation hypervisors will be brought within the scope of PCI DSS.
  • Rotation requirements for the purposes of key management will be “based on industry best practices and guidelines” rather than an annual stipulation.
  • Identity and authentication requirements for users, “non-consumers” and administrators will be split further.
  • More specific requirements will be implemented around the auditability and security of timekeeping, especially as recorded in audit logs.  (Coordinated and reliable timestamps are helpful during civil and criminal investigations as well as internal forensics investigations.)

In addition, Lampe identifies the expected incorporation of tokenization technologies into official PCI guidance as a key security and cost-saving development.

“Tokenization – the use of data tokens in place of sensitive data such as PAN – is essentially a cost saving measure,” Lampe continued.  “Early adopters are shrinking the costs of PCI compliance by handing responsibility for their most sensitive information to a trusted custodian, saving them the expense of treating every interaction as top secret.   Tokenization is already accepted by Visa and is the focus of a current PCI Council committee; the next logical step is for it to be incorporated into official PCI guidance.”

To find out more about PCI DSS compliant managed file transfer solutions, please contact Pro2col on 0333 123 1240.

Data: Transferring the Burden Under PCI DSS

GT News have just published a great article written by Jonathan Lampe (Vice President of Product Management at Ipswitch) regarding data transfer requirements under PCI DSS.  If you are looking for a PCI DSS-compliant file transfer solution, these are the points you really need to take into consideration:

Data: Transferring the Burden Under PCI DSS

Jonathan Lampe, Ipswitch – 08 Jun 2010

Despite widespread adoption of the Simple Object Access Protocol (SOAP) and transaction sets in the financial industry, a surprisingly high percentage of the data flow is still represented by files or bulk data sets. In 2009, Gartner determined that bulk data transfers comprise around 80% of all traffic. This is probably a surprise if your company is among the many with millions invested in just managing individual transactions – but there are good management and security reasons for this continuing situation.

Why is File Transfer Still Common?

Financial institutions and item processors are still 'FTP'ing' (file transfer protocol), emailing, or sending and sharing files instead of transactions for a number of reasons. First, it helps hide the complexity of systems on both ends – there is no reliance on, or concern about, libraries of transactions and responses related to one system and a different set related to another. Second, it reduces the risk of transmission failure and makes it less risky for employees to send a small number of files or bulk data sets rather than a large number of transactions. Finally, it also increases the reliability of the overall operation.

The Managed File Transfer Industry

The managed file transfer (MFT) industry comprises providers whose solutions manage and protect these bulk data sets as they move between partners, business areas and locations. Collectively they address challenges presented by bulk data transfers and principles-based rules of the sort that have become common over the past few years – for example the Data Protection Principles or International Financial Reporting Standards (IFRS). Fundamentally, these are rules that tend to embody real-world outcomes as a standard. So, for example, the reported outcomes of penetration testing depend for certification as much upon the experience of the tester (who may be an employee) as upon the integrity of the network. This is all fine – until your network meets the real world. Principles-based rules tend to put the onus squarely on us to make and maintain systems.

For consumers, consultants and Payment Card Industry (PCI) assessors, this is undoubtedly ‘a good thing’. For those handling card data, the costs of validated and effective compliance represent a potentially significant burden that’s worth passing on to an industry that has quietly got on with the job well before buzzwords, such as ‘cloudsourcing’ or even ‘outsourcing’, entered the lexicon.

Vendors and Technologies Need Evaluation

It therefore makes a great deal of sense to place as much of that onus, and indeed risk and potential liability, on the shoulders of others – suppliers and consultants – as we can. Although PCI Data Security Standard (PCI DSS) can, and does, descend into tick-box detailed level rules in some places – which it makes very good sense to sign off to trusted third parties – nevertheless, significant ongoing parts of our obligations under PCI DSS are essentially management issues. Despite subjective components and PCI requirements to take ongoing account of best practices, the technologies themselves can still be evaluated on a relatively straightforward mechanistic basis, provided that they are submitted to sufficient scrutiny.

At the most basic level, subjective terms such as ‘adequate’ or ‘insecure’ are sometimes to be understood (explicitly or otherwise) as denoting specific technologies or other standards in line with industry best practice and are, therefore, a route to initially evaluating software on a tick-box basis.

Beyond Ticking Boxes – Four Initial Considerations

When evaluating for data security technology in the context of regulated activities, you should look at how four categories – confidentiality, integrity, availability, and auditing – contribute to security and compliance. These headline considerations are designed to assist in assessing whether a data technology or process is likely to provide one-time compliance for the purposes of PCI DSS.

Confidentiality ensures that information can be accessed only by authorised individuals and for approved purposes. For the purposes of PCI DSS this means that employees should have the minimum level of access necessary to do their job. Confidentiality begins with authentication of login credentials on every secure application, starting with a strong password policy backed by robust account expiry procedures and password management.
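As a rough illustration of the kind of password policy check described above, the sketch below enforces a minimum length plus character-class rules. The specific rules are illustrative assumptions, not PCI DSS requirements:

```python
import re

def meets_policy(password: str, min_length: int = 8) -> bool:
    """Check a candidate password against a simple strength policy:
    minimum length, plus upper-case, lower-case, digit and symbol classes.
    The exact thresholds are illustrative, not mandated by PCI DSS."""
    if len(password) < min_length:
        return False
    required_classes = [
        r"[A-Z]",        # at least one upper-case letter
        r"[a-z]",        # at least one lower-case letter
        r"[0-9]",        # at least one digit
        r"[^A-Za-z0-9]", # at least one symbol
    ]
    return all(re.search(pattern, password) for pattern in required_classes)
```

A real MFT deployment would combine a check like this with expiry, lockout and history rules managed centrally.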

Integrity, as repeatedly addressed in PCI DSS rules 10, 11 and 12, is relatively under-appreciated – often understood solely as a security issue – but is a critical component of compliance. It means ensuring the uncompromised delivery of data, with full Secure Hash Algorithm (SHA)-512 support. In the case of file transfer operations, non-repudiation takes data security to the highest level currently available by adding digital certificate management to secure delivery and data encryption beyond the requirements of PCI DSS. The setting up of alerts is a relatively easy goal – a box ticked on the route to compliance.
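For example, verifying that a delivered file matches the digest its sender published can be sketched with Python's standard library; the function names here are illustrative, not part of any vendor's product:

```python
import hashlib

def sha512_digest(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-512 digest of a file, reading in chunks so
    large transfers do not need to fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(path: str, expected_digest: str) -> bool:
    """Compare the received file's digest with the one the sender published."""
    return sha512_digest(path) == expected_digest
```

An MFT product does this automatically on both ends of a transfer and records the result in its audit trail.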

Availability is not explicitly addressed in PCI standards but is a critical component of any overall security strategy. It can and should be addressed, if not guaranteed, through load balancing and clustering architectures that support automatic failover and centralised configuration data storage to minimise the chance of a data breach.

Auditing capabilities should be demonstrated by vendors in the form of comprehensive logging and log viewing with tamper evident measures to guarantee the integrity of log files. For technology, security, and other auditing purposes, all client/server interactions and administrative actions should be logged.
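One common way to make a log tamper-evident, sketched here as a general technique rather than any specific vendor's implementation, is to chain each entry to the hash of the previous one, so altering any earlier entry invalidates every later one:

```python
import hashlib

def append_entry(log: list, message: str) -> None:
    """Append a (message, hash) pair whose hash covers the previous
    entry's hash, forming a verifiable chain."""
    prev_hash = log[-1][1] if log else "0" * 64
    entry_hash = hashlib.sha256((prev_hash + message).encode()).hexdigest()
    log.append((message, entry_hash))

def verify_log(log: list) -> bool:
    """Recompute the chain from the start and confirm no entry was altered."""
    prev_hash = "0" * 64
    for message, entry_hash in log:
        expected = hashlib.sha256((prev_hash + message).encode()).hexdigest()
        if expected != entry_hash:
            return False
        prev_hash = entry_hash
    return True
```

Production systems add timestamps and signatures on top of the chain, but the underlying idea is the same.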

The Hitchhiker’s Guide to File Transfer in the PCI DSS Galaxy

The main body of the PCI DSS is divided into 12 requirements.

Section 1 establishes firewall and router configuration standards by requiring all managed file transfer (MFT) vendors to build a product architecture that puts a proxy, gateway or tiered application into a demilitarised zone (DMZ) network segment. This requirement also puts the actual storage of data and any workflows associated with it into internal networks.

The best architectural implementations ensure that no transfer connections are ever initiated from the DMZ network segment to the internal network. Typically this is accomplished using a pool of proprietary, internally established connections. In this way, clients can connect using FTP Secure (FTPS), Secure File Transfer Protocol (SFTP), etc. to the DMZ-deployed device, but the transfers involving internal resources are handled between DMZ- and internally-deployed vendor devices by the proprietary protocol.

Section 2 demands that no default or backdoor passwords remain on the system and that systems are hardened. These best practices are generally enforceable with MFT technology, but the best implementations include a hardening utility that also extends protection to the operating system on which the MFT software runs.

Section 3, particularly subsection 3.4, covers encryption of data and storage of keys. To address these issues MFT vendors have an array of synchronous and asynchronous encryption technologies, such as OpenPGP, to ensure data is secured at rest. Cryptography is almost always performed using Federal Information Processing Standards (FIPS)-validated modules and secure overwrite of data is commonly used.

Section 4 covers encryption of data in motion. All MFT vendors currently support multiple open technologies such as Secure Socket Layer (SSL), Secure Shell (SSH) and Secure/Multipurpose Internet Mail Extensions (SMIME) in multiple open protocols, including SFTP, FTPS and Applicability Statement 2 (AS2), to provide this protection.
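As an illustration of the client side of such a protected transfer, Python's standard library can open an FTPS session over an explicitly hardened TLS context. This is a minimal sketch, not any vendor's implementation; the host would be your own server, and real deployments would pass credentials and apply local certificate policy:

```python
import ssl
from ftplib import FTP_TLS

def make_ftps_context() -> ssl.SSLContext:
    """Build a TLS context that verifies the server certificate and
    refuses protocol versions older than TLS 1.2."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def connect_ftps(host: str, user: str, passwd: str) -> FTP_TLS:
    """Connect to an FTPS server and switch the data channel to TLS too."""
    ftps = FTP_TLS(context=make_ftps_context())
    ftps.connect(host, 21)
    ftps.login(user, passwd)
    ftps.prot_p()  # protect the data connection, not just the control channel
    return ftps
```

Without the `prot_p()` call, only the control channel is encrypted and file contents would still travel in the clear.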

Section 5 ensures anti-virus (AV) protection is in place for systems and the data that passes through them. Most MFT vendors can provide both types of protection with their software. The best allow integration with existing AV implementations and security event and incident management (SEIM) infrastructure.

Section 6 requires secure systems and applications. Most MFT vendors conform to the guidelines here, particularly subsection 6.5 on web application security. However, there are large variations on fidelity to subsection 6.6 in the industry. The best vendors use a battery of security assessment and penetration tools, such as HP WebInspect and protocol fuzzers, to ensure that their software exceeds PCI security requirements – and remains that way from release to release. The best vendors also have multiple security experts working with developers to ensure new features are secure by design. These attributes are not always easy to find on a vendor’s website, but they are critical to the long-term viability of an MFT application – be sure to ask.

Sections 7 and 8 cover the establishment of identity and authority. MFT solutions typically have built-in features that cover these issues from multifactor authentication to sharing of accounts. However, there are two common areas of difference between MFT vendors in these sections. The first is the ability to rapidly ‘de-provision’ users (i.e. disable or delete the account upon termination). The second is the proper storage of passwords: some vendors still use unkeyed hashes or weak Message-Digest algorithm 5 (MD5) hashes, both of which are susceptible to either rainbow table or collision attacks.
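The difference highlighted above can be sketched with Python's standard library: an unkeyed MD5 hash, where identical passwords always produce identical digests, versus a salted, iterated PBKDF2 hash. The iteration count is an illustrative assumption, not a PCI-mandated value:

```python
import hashlib
import os

ITERATIONS = 200_000  # illustrative work factor, tune per current guidance

def weak_hash(password: str) -> str:
    """Unkeyed MD5: identical passwords yield identical hashes,
    which is exactly what rainbow tables exploit."""
    return hashlib.md5(password.encode()).hexdigest()

def strong_hash(password: str) -> tuple:
    """Salted, iterated PBKDF2-HMAC-SHA256: the per-user salt defeats
    precomputed tables, and the iterations slow brute force."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the PBKDF2 digest with the stored salt and compare."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS) == digest
```

Because `strong_hash` draws a fresh random salt each time, two users with the same password store different digests, unlike the MD5 case.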

Section 9 is about physical access and is one that many software vendors erroneously ignore. However, subsection 9.5 is about off-site backups and is a function that MFT software often provides. One advantage of using an MFT solution for this purpose is that all the security benefits from the MFT solution flow into the backup process as well.

Section 10 is about auditing and visibility into data. MFT vendors also typically have a strong story around these attributes. Common features of MFT include visibility into the full ‘life cycle’ of files, aggregate reporting, detailed logging of every administrative action, and enforcement of specific service level agreements (SLAs). Some MFT solutions also ensure that audit logs and transfer integrity information are tamper-evident to ensure complete non-repudiation of data delivery.

Section 11 is about regular testing of systems and processes. As mentioned above, MFT vendors who perform these types of tests on their own solutions before releasing their software to the public should be sought out and preferred by companies that must adhere to PCI DSS.

Section 12 is about maintaining and enforcing a security policy down to the level of end user training. Like section 9, section 12 is another section many software providers erroneously ignore. However, the best MFT vendors know that providing fingertip reporting and good user experience to both administrators and end users can go a long way toward encouraging proper use of technology.

PCI DSS Appendices A (‘Additional PCI DSS Requirements for Shared Hosting Providers’) and E (‘Attestation of Compliance – Service Providers’) are also often used when managed file transfer services through virtual area network (VAN), software-as-a-service (SaaS), hosted or cloud providers are used. Key requirements here include ensuring that the service provider is not allowing shared users, that different organisations can only see their own logs and that the provider has policies that provide for a timely forensics investigation in the event of a compromise.

Summary

The substance of the PCI burden is an ongoing one. To look down the list of PCI requirements is to scan a list of injunctions to 'maintain', 'monitor' and 'ensure' that echo the 'manage, monitor and secure' objectives of basic FTP technology. However, as the March 2008 Hannaford data breach shows, it is possible to be ostensibly compliant – to have ticked all the boxes – and yet not be fully secure.

PCI DSS compliance requires organisations to protect the security, privacy, and confidentiality of information – and to document who accesses the information and the security measures taken to prevent theft, loss, or accidental disclosure.

Get in touch with one of the team for further information on the range of products by Ipswitch File Transfer on 0333 123 1240 or [email protected]

Positive results for Pro2col and co-exhibitors at Infosecurity

We made the decision to attend Infosecurity for the first time this year, with the intent of affirming Pro2col's position as the UK's leading supplier and integrator of secure file transfer technologies, with a range of carefully selected products designed to meet the requirements of any business.  Spurred by the formation of partnerships with some of the world's leading secure file transfer vendors including Aspera, Ipswitch, Data Expedition, Biscom and Stonebranch, we were fortunate enough to have experts from two vendors on the Pro2col stand, ready to impart their extensive product knowledge to attendees from around the world.

In customary form, after spending months meticulously planning for Infosec, the days leading up to the show were a little unsettling for us.  With not one but two co-exhibitors traveling from the US to London, nature decided that the pressure of event organisation was not enough and kindly added a humongous ash cloud to the mix – leaving us wondering whether or not half of our stand would actually make the event!

Despite initial concerns over travel arrangements (everyone made it, thankfully – even if a little jet-lagged), we are excited to say that the show was a great success for all parties involved.  With over 10 years' experience within the file transfer arena, we can empathise with how daunting the broad spectrum of solutions in this marketplace can be for businesses sourcing the most suitable solution for their requirements.  Resellers and end users alike were very receptive to the impartial file transfer advice and product demonstrations offered by Pro2col representatives, and pleased to benefit from specialist product information imparted by Jon Laughland, UK Sales Executive for Stonebranch, and Charlie Magliato, Channel Manager for Biscom Delivery Server.

From our perspective, it was brilliant to see just how seriously companies are taking the security of their sensitive data.  We spoke to IT professionals from a wide range of market sectors, from the public sector (government bodies, healthcare organisations, universities) to retail, publishing, banking and legal firms – the list is endless!  Although unable to give each visitor the time allocated in a typical demonstration or consultation, we were able to glean valuable insight regarding the way businesses are currently moving their sensitive data and provide a neutral recommendation for products to meet their operational needs.

Another factor that surfaced repeatedly during the event was the financial investment associated with some secure file transfer solutions.  There's an abundance of smaller companies out there with a requirement to transfer files securely that just don't have the budget for a good percentage of the secure file transfer products available.  Similarly, larger corporate organisations don't want to pay over the odds for potential solutions.  Pro2col has spent a great deal of time scouring the marketplace to select products that not only cater for all file transfer requirements, but do so at an affordable cost!

As we are continually looking for ways to improve the services we provide to both existing and potential customers, Infosec was a great learning experience for us in terms of the security marketplace and a productive exercise for the business in terms of relationship building with customers and resellers.