File Transfer Definitions
As with all technical products and solutions, the terms and abbreviations used to describe different technologies and processes can be confusing. Our file transfer definitions section explains some of the more common terms and abbreviations you may come across when browsing our website.
Ad Hoc File Transfer
The term “ad hoc” or “person to person” file transfer is better described as someone wanting to send a file to another person, generally on a ‘one-off’ basis.
To elaborate, let's set the scene. It's 5.30 pm on a Friday afternoon. You have been working on a really important proposal for a new client that's due for submission and you just HAVE to send it now. It could be that the file is too large for your email server to process as an attachment, or that it's super sensitive and you need visibility of receipt by your client rather than the send-and-forget approach of email. What do you do?
It's in scenarios such as this that ad hoc file transfer comes into its own. It offers businesses a quick and simple means of sending large or sensitive files without the hassle of creating and managing end-user accounts on FTP servers, or stressing about whether an email got there.
An ad hoc file transfer solution will quite simply allow you to create a single job, enter the email address of your recipient and press send. That’s it. No further details concerning your recipient are required and all they need in order to receive the file is a standard email account.
Ad hoc file transfer is typically implemented by businesses that need greater visibility of files leaving and entering the organisation. More often than not, it is implemented to replace consumer-grade solutions like Dropbox or other cloud-based systems.
Do you have a requirement to send large or sensitive files on an ad hoc basis? If so and you’d like to find out more about the ad hoc file transfer solutions supplied by Pro2col, please don’t hesitate to contact us on 0333 123 1240.
AES (Advanced Encryption Standard)
AES is an encryption algorithm or standard used to secure sensitive data. It is a symmetric block cipher that encrypts (enciphers) and decrypts (deciphers) electronic information.
Selected and standardised by the National Institute of Standards and Technology (NIST), AES was adopted by the US government in 2002. It is one of the most popular algorithms used in symmetric-key cryptography, having become the replacement for DES (Data Encryption Standard).
AES comprises three block cipher variants, using key lengths of 128, 192 and 256 bits with a fixed 128-bit block size. The underlying Rijndael cipher also supports other block and key sizes in multiples of 32 bits, although these fall outside the AES standard.
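AES implementations live in third-party libraries rather than the Python standard library, so the sketch below illustrates only the symmetric principle (the same shared key both enciphers and deciphers) using a toy stream cipher built from SHA-256 in counter mode. It is not AES; production systems should use a vetted implementation, for example via the `cryptography` package.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing the key with a counter.
    # This mimics how counter (CTR) mode turns a block cipher into a
    # stream cipher, but uses SHA-256 instead of AES for portability.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; applying the same function again decrypts,
    # which is the defining property of a symmetric cipher.
    ks = keystream(key, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # symmetric: one shared key for both directions

key = b"a 256-bit key would be 32 bytes."
message = b"Sensitive payroll data"
ciphertext = encrypt(key, message)
assert ciphertext != message
assert decrypt(key, ciphertext) == message
```

The same key object is handed to both `encrypt` and `decrypt`; in a real AES deployment that shared secret has to be distributed securely, which is exactly the problem protocols like TLS solve.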
Automated File Transfer
Automated file transfer refers to moving files between systems on a schedule or in response to events, without manual intervention, typically using workflows defined in a managed file transfer solution.
Data Protection Act
The Data Protection Act of 1998 was brought into force on March 1st 2000. Introduced to give UK citizens the right to access personal information held by ‘data controllers’ (any individual within an organisation handling personal data) within the United Kingdom, the Data Protection Act also details principles concerning the way in which this sensitive data is managed.
There are eight core principles covered under the Data Protection Act. These are as follows:
- Personal data should be processed fairly and lawfully.
- Data should only be obtained for specified purposes and should not be further processed in a manner incompatible with these purposes.
- Personal data should be adequate, relevant and not excessive in relation to the purposes for which it was collected.
- Personal data should be accurate and where necessary kept up to date.
- Personal data should not be kept longer than is needed for its intended purpose.
- Personal data should be processed in accordance with the rights of the individual whom the information concerns.
- Appropriate measures should be taken against unauthorised or unlawful processing of personal data, and against accidental loss, destruction or damage.
- Personal data should not be transferred outside the European Economic Area (the EU states plus Liechtenstein, Iceland and Norway).
The principle of the Data Protection Act most applicable to the implementation of secure file transfer provisions is the seventh, which states:
“Having regard to the state of technological development and the cost of implementing any measures, the measures MUST ensure a level of security appropriate to – the harm that might result from such unauthorised or unlawful processing or accidental loss, destruction or damage as are mentioned in the seventh principle AND the nature of the data protected.”
Therefore all organisations, as governed by UK law, must ensure that adequate safeguards are in place regarding the storage and processing of personal data.
Our specialists at Pro2col can help you to source and implement a secure file transfer solution to suit your business requirements and align the processing of data, in accordance with The Data Protection Act. Please contact us on 0333 123 1240 for more information.
Data Controller
This is the individual within an organisation who is responsible for the data. The data controller defines the data collected and the reasons for processing it.
Data Portability
Under GDPR, individuals have the right to have their personal data transferred to another system or organisation.
Data Processor
Someone who processes data on behalf of the Data Controller.
Data Protection by Design & by Default
This is an overarching principle of GDPR. It means building data protection into business processes, products and services from the outset.
Data Protection Impact Assessment (DPIA)
This is a document that describes the nature of the data, the purpose of the transfer, how it is performed and the security configuration. A DPIA is a key requirement of GDPR.
Data Subject
This is the individual the data is about.
Extreme File Transfer
Here at Pro2col we're increasingly being asked by our clients to help them move large data sets. As everyone knows, the amount of data being generated is growing, as are file sizes; it is now common in our discussions to talk about files many gigabytes in size. The challenge this presents is how to move the data from point A to point B, as invariably we're finding that companies need to move these volumes of data halfway around the world. Welcome to extreme file transfer! The expression has become more widely adopted in recent times, but what does it mean? IDC describes it as follows:
“Extreme file transfer requirements come from the need to solve problems around file size. In this case, the file may simply be too big and the target too far away to reliably deliver over TCP/IP because of shortcomings in this networking protocol. In other cases, there is a problem delivering a file within an allowed time window, and therefore, there is a need to find an alternative approach.” [“IDC competitive review of MFT Software” October 2010]
IDC pretty much hits the nail on the head here, although I'd place a little more emphasis on the infrastructure over which extreme file transfer takes place. The efficiency of TCP-based file transfer protocols such as FTP drops dramatically as latency increases the round-trip time between client and server; the result is greater packet loss and rapidly decreasing throughput. For file-based workflows there are some great solutions available which address these issues. Vendors have taken various approaches, such as breaking files into smaller pieces, hardware acceleration, compression, synchronisation, even shipping a hard drive, but the most effective are those that build on the UDP protocol. UDP on its own offers no delivery guarantees, but once flow control and retransmission are added it becomes the most effective way of moving extremely large files over extremely challenging connections. These re-engineered protocols form the cornerstone of extreme file transfer solutions. The vendors in this space have designed their solutions around these typical business requirements:
- Disaster recovery and business continuity
- Content distribution and collection, e.g., software or source code updates, or CDN scenarios
- Continuous sync – near real time syncing for ‘active-active’ style HA
- Replication, from basic master-slave setups to more complex bi-directional sync and mesh scenarios
- Person to person distribution of digital assets
- Collaboration and exchange for geographically-distributed teams
- File based review, approval and quality assurance workflows
If your business needs to transfer extremely large files or volumes of data speak to our team of expert consultants. As independent file transfer specialists since 2003 we’re able to provide an objective vendor agnostic view to finding the right solution for your file transfer needs. Speak to one of our consultants now on 0333 123 1240.
Federal Information Processing Standards (FIPS)
Federal Information Processing Standards (FIPS) are a series of standards outlining the requirements that IT products must satisfy to be acceptable for use by US Federal government agencies and contractors. Developed by the National Institute of Standards and Technology (NIST), the FIPS validation process ensures that technology products are rigorously tested and deemed sufficiently secure to handle sensitive data.
There are a number of different FIPS standards, including 186-2 (Digital Signature Standard), 190 (Guideline for the Use of Advanced Authentication Technology Alternatives) and 197 (AES), but by far the most significant standard in terms of secure data transfer is FIPS 140.
FIPS 140 defines the requirements and standards that must be met by cryptographic modules (components) used in computer hardware and software solutions. As IT solutions are used in different departments and environments, the scope of cryptographic requirements imposed by FIPS has been broken down into eleven distinct areas and four increasing, qualitative security levels. They are as follows:
- Cryptographic module specification (what must be documented).
- Cryptographic module ports and interfaces (what information flows in and out, and how it must be segregated).
- Roles, services and authentication (who can do what with the module, and how this is checked).
- Finite state model (documentation of the high-level states the module can be in, and how transitions occur).
- Physical security (tamper evidence and resistance, and robustness against extreme environmental conditions).
- Operational environment (what sort of operating system the module uses and is used by).
- Cryptographic key management (generation, entry, output, storage and destruction of keys).
- EMI/EMC (electromagnetic interference/electromagnetic compatibility).
- Self-tests (what must be tested and when, and what must be done if a test fails).
- Design assurance (what documentation must be provided to demonstrate that the module has been well designed and implemented).
- Mitigation of other attacks (if a module is designed to mitigate specific attacks, its documentation must say how).
If you are purchasing a FIPS accredited solution, you can rest assured the product has been rigorously tested and is physically secure enough to protect your sensitive data.
Our specialists at Pro2col can help you to source and implement a FIPS accredited, secure file transfer solution to suit your business requirements. Please contact Pro2col on 0333 123 1240 for more information.
FTP File Transfer
FTP (File Transfer Protocol) is a method used to transfer files from one computer to another through a network, whether that's an internal network (from one computer to another within the same network) or, more commonly, a wide area network such as the Internet.
An FTP site is a server, hosted on the Internet and used as an exchange area for uploading and downloading files to and from. FTP sites are accessed using a software program known as an FTP Client. All FTP sites will have a hostname and this, along with a username and password assigned to you by the FTP site administrator, will be required to connect the FTP client to the site.
Once connected to the FTP Site, the FTP Client allows the user to browse through files and folders on both their personal computer and the FTP site. Files can then be selected and either uploaded to the FTP site or downloaded from the FTP site.
FTP is not a particularly simple file transfer protocol to use and has a number of drawbacks, such as poor performance over high-latency connections, a lack of reporting and no real security of data, but it is still widely used because it is a cheap, often free, solution.
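Python's standard library includes an FTP client in the `ftplib` module, which makes the client-to-server workflow above easy to sketch. The host, credentials and file names below are placeholders, not a real server:

```python
from ftplib import FTP

def upload_report(host: str, user: str, password: str,
                  local_path: str, remote_name: str) -> str:
    """Connect to an FTP server, upload one file and return the server reply."""
    with FTP(host) as ftp:                 # control connection, port 21 by default
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as fh:
            # STOR opens a separate data connection for the actual transfer
            reply = ftp.storbinary(f"STOR {remote_name}", fh)
    return reply

# Hypothetical usage (would require a live server):
# upload_report("ftp.example.com", "alice", "secret",
#               "proposal.pdf", "incoming/proposal.pdf")
```

Note that `ftplib` sends the username and password in plain text, which is exactly the security drawback mentioned above; the FTPS entry below shows the encrypted variant.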
Pro2col offers a wide range of secure FTP alternative solutions – please contact us on 0333 123 1240 if you would like to find out more.
FTPS File Transfer
FTPS file transfer, also referred to as FTP Secure or FTP-SSL, is a secure means of sending data over a network. Often misidentified as SFTP (an independent communications protocol in its own right), FTPS describes the sending of data using basic FTP run over a cryptographic protocol such as SSL (Secure Sockets Layer) or TLS (Transport Layer Security).
Cryptographic protocols ensure that the connection established between a client and server is encrypted, maintaining the security and integrity of the data sent. They use public/private key cryptography during the handshake to authenticate the endpoints and agree a shared session key, which is then used to encrypt the data in transit. If the data stream were intercepted during transmission, any documents in transit would be illegible to hackers or eavesdroppers.
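Python's `ftplib.FTP_TLS` class implements explicit FTPS, where the client connects on the normal FTP port and then upgrades the session to TLS before sending credentials. A hedged sketch (host and credentials are placeholders):

```python
import ssl
from ftplib import FTP_TLS

def list_directory_ftps(host: str, user: str, password: str) -> list:
    """Explicit FTPS: connect over plain FTP, then upgrade the session to TLS."""
    context = ssl.create_default_context()   # verifies the server certificate
    ftps = FTP_TLS(host, context=context)
    ftps.login(user=user, passwd=password)   # credentials now travel encrypted
    ftps.prot_p()                            # encrypt the data channel as well
    listing = ftps.nlst()                    # directory listing over TLS
    ftps.quit()
    return listing

# Hypothetical usage (would require a live FTPS server):
# list_directory_ftps("ftps.example.com", "alice", "secret")
```

The `prot_p()` call matters: without it only the control channel (commands and passwords) is encrypted, while file contents would still cross the network in the clear.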
General Data Protection Regulation (GDPR)
The new EU regulation for handling personal data – in place from 25th May 2018.
Gramm-Leach-Bliley Act (GLBA)
The Gramm-Leach-Bliley Act of 1999, also known as the Financial Services Modernization Act, details regulations that financial institutions must adhere to in order to protect consumers' financial information. The GLBA governs all financial institutions that hold what is classed as 'personal data', including insurance companies, security firms, banks, credit unions and retailers providing credit facilities.
Gramm-Leach-Bliley Rules and Provisions
The privacy requirements set out in GLBA are broken down into three distinct elements; the Financial Privacy Rule, Safeguards Rule and Pretexting Provisions.
The Financial Privacy Rule – Governs the collection of consumers' private financial data by financial institutions, including companies that deal with such information. It requires all financial institutions to provide privacy notices to their customers prior to the establishment of a relationship. Such privacy notices should also detail the institution's information-sharing practices and give consumers the right to limit the sharing of their information in certain instances.
The Safeguards Rule – Requires all financial institutions to document and implement a security plan that protects the confidentiality of their customers' personal data.
The Pretexting Provisions – Pretexting refers to the use of unsolicited means, in order to gain access to non-public, personal information e.g. impersonating an account holder on the phone to obtain personal details. GLBA requires those governed by the law, to implement adequate provisions to safeguard against Pretexting.
What are the implications of Gramm-Leach-Bliley in terms of file transfer?
In order to comply with GLBA when transferring sensitive data, financial institutions must ensure that they:
- Prevent the transmission and delivery of files and documents containing non-public personal information to unauthorised recipients.
- Enforce document delivery and receipt through enterprise-defined policies.
- Provide detailed logs and audit trails of content access, authorisation and use.
Our specialists at Pro2col can help you to source and implement a GLBA compliant, secure file transfer solution to suit your business requirements. Please contact Pro2col on 0333 123 1240 for more information.
HTTP File Transfer
HTTP (Hypertext Transfer Protocol) is a set of rules for exchanging files on the World Wide Web. HTTP defines how messages are formatted and sent, as well as the actions web servers and browsers should take in response to commands.
A browser is used to send an HTTP command to a web server over an established TCP (Transmission Control Protocol) connection. The web server then sends HTML pages back to the user's browser; these are what we would refer to as webpages. For example, when you enter a URL in a web browser, this actually sends an HTTP command to the web server, instructing it to fetch and transmit the requested webpage.
HTTP file transfer can also be used to send files from a web server to a web browser (or to any other requesting application that uses HTTP). HTTP is referred to as a stateless protocol because each command is independent of the others: the connection established between the browser and the web server is closed as soon as the web server responds to the initial command. In contrast, FTP is a two-way file transfer protocol; once a connection is established between a workstation and a file server, files can be transferred back and forth between the two.
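The "rules for formatting messages" are easy to see in a raw request. The sketch below hand-builds the HTTP/1.1 GET request a browser would send for a hypothetical file URL; the `Connection: close` header reflects the stateless, one-command-per-connection behaviour described above:

```python
# A minimal, hand-built HTTP/1.1 GET request, as a browser would format it.
# Host and path are hypothetical placeholders.
host = "www.example.com"
path = "/files/report.pdf"

request = (
    f"GET {path} HTTP/1.1\r\n"   # request line: method, resource, version
    f"Host: {host}\r\n"          # mandatory header in HTTP/1.1
    "Connection: close\r\n"      # stateless: close after this one response
    "\r\n"                       # blank line ends the headers
)

assert request.startswith("GET /files/report.pdf HTTP/1.1\r\n")
assert request.endswith("\r\n\r\n")
```

Every line ends in `\r\n` and a blank line separates the headers from the (empty) body; the server's response follows the same framing rules in the other direction.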
For further information on the HTTP file transfer solutions we provide, please contact us on 0333 123 1240 or take a look at our:
Guide to Managed File Transfer
Guide to Ad Hoc File Transfer
HTTPS File Transfer
HTTPS file transfer describes the combination of HTTP (Hypertext Transfer Protocol) and a secure protocol such as SSL or Transport Layer Security (TLS). It is used to send sensitive data over unsecured networks, for example the Internet.
These individual protocols operate on different levels of the ‘network layer’, derived from the TCP/IP model to create HTTPS. The HTTP protocol operates at the highest level of the TCP/IP model (the application level) and is used to format and send data, whereas SSL works at a slightly lower level (in between the application layer and the transport layer), securing the connection over which data will be sent.
HTTPS is the primary file transfer protocol used to secure online transactions. By default it uses port 443, as opposed to the standard HTTP port of 80. URLs beginning with https indicate that the connection between the browser and the server is encrypted using SSL/TLS.
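Python's standard `ssl` module shows these defaults directly: a client context created for HTTPS use verifies the server's certificate and hostname out of the box, and the conventional port differs from plain HTTP. A small sketch:

```python
import ssl

# Default client-side TLS context, as used for an https:// connection:
# certificate verification and hostname checking are enabled by default.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# HTTPS defaults to TCP port 443; plain HTTP defaults to port 80.
HTTPS_PORT, HTTP_PORT = 443, 80
```

A library such as `urllib.request` wraps an ordinary HTTP exchange inside a socket secured by a context like this, which is exactly the layering described above.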
For more information regarding the HTTPS file transfer solutions that we provide, contact Pro2col on 0333 123 1240 or take a look at our:
Guide to Managed File Transfer
Guide to Ad Hoc File Transfer
Internet Protocol Suite
The Internet Protocol Suite is the set of communication protocols, developed over time by the IT community, for sending data over computer networks such as the Internet. TCP (Transmission Control Protocol) and IP (Internet Protocol) were the first two protocols included in the suite and are the basis from which the term originated.
Sometimes referred to as TCP/IP, the Internet Protocol Suite consists of a number of internetworking protocols organised into 'network layers'. Each layer is designed to solve a specific issue affecting the transmission of data. Higher layers are closer to the user and deal with more abstract data, relying on lower layers to convert data into forms that can be physically transmitted.
To elaborate, please refer to table (1a), which breaks down the layers included in the TCP/IP suite and explains each layer's function and the protocols that can be used to fulfil those functions.
For more information regarding file transfer technologies which make use of the TCP/IP stack, contact Pro2col on 0333 123 1240 or take a look at our:
Guide to Managed File Transfer
Guide to Ad Hoc File Transfer
ISO 27001
ISO 27001 is an information security management standard, published in October 2005 by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). It specifies requirements for an Information Security Management System (ISMS).
Essentially an updated version of the old BS7799-2 standard, ISO 27001 provides a model for establishing, implementing, operating, monitoring, reviewing, maintaining and improving a documented Information Security Management System within an organisation. Taking into consideration a specific organisation’s overall perceived risk, it details requirements for the implementation of security controls, suited to the needs of individual businesses.
Many organisations will have information security controls in place, but what many are lacking (and what ISO 27001 covers) is the need for a management approach to these controls.
ISO 27001 Standards
The ISO 27001 standard is an optional certification that provides a structured approach to implementing an Information Security Management System. If an organisation takes the decision to adopt this standard, the specific requirements stipulated by ISO 27001 must be followed, as auditing and compliance checks will be made.
ISO 27001 requires that management within the organisation must:
- Systematically assess the organisation’s information security risks, taking account of the threats, vulnerabilities and impacts.
- Design and implement a coherent and comprehensive suite of information security controls and/or other forms of risk treatment (such as risk avoidance or risk transfer) to address those risks that it deems unacceptable.
- Adopt an all-encompassing management process to ensure that the information security controls continue to meet the organisation’s information security needs on an ongoing basis.
What are the implications of ISO 27001 in terms of file transfer?
If your organisation has adopted the ISO 27001 standard, you must ensure that any file transfer solution purchased adheres to your implemented ISMS.
Our specialists at Pro2col can help you to source and implement an ISO 27001-certified, secure file transfer solution to suit your business requirements. Please contact Pro2col on 0333 123 1240 for more information.
LAN (Local Area Network)
LAN is the abbreviation used to describe a Local Area Network. The term “Local Area Network” refers to a computer network that covers a small physical area, usually confined to one building or a small group of buildings e.g. a home network or a business network.
A LAN is usually implemented to connect local workstations, servers and devices. This enables documents held on individual computers and servers to be accessed by any workstation within the LAN network. It also allows devices such as printers to be shared between workstations.
As a LAN connects a group of devices in close proximity, data can usually be transmitted at a faster rate than across a wide area network. LANs are also relatively inexpensive to set up, using hardware such as Ethernet cables, network adaptors and hubs.
Latency
Latency is the time taken to send a data packet from a source to its intended destination; the higher the latency, the slower the data transmission. It incorporates all elements of the file-sending process, including encoding, transmission and decoding.
Certain delivery protocols, such as FTP, are particularly susceptible to latency. When sending packets of data to the remote site, the sending side waits for an acknowledgement that each packet has been received before sending the next one, which becomes extremely time-consuming when latency is high. In extreme cases, the time spent delivering data and then listening for the reply can cause throughput to drop so low that the solution is rendered useless.
There are several ways to combat this. One is to use a multi-threaded TCP transfer, which works in the same manner as above but issues many packet transfer requests in parallel, increasing throughput. Another increasingly popular route is to adopt a UDP-based delivery protocol, which takes a send-and-forget approach, i.e. not waiting for an acknowledgement receipt. This can significantly speed up delivery, but additional features are required, as UDP out of the box won't work for everyone.
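The send-and-forget behaviour is easy to demonstrate with Python's standard `socket` module: a UDP sender transmits a datagram and returns immediately, with no handshake and no acknowledgement. A minimal loopback sketch:

```python
import socket

# UDP "send and forget": the sender does not wait for an acknowledgement.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))            # let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"packet 1", ("127.0.0.1", port))   # returns immediately

data, addr = receiver.recvfrom(1024)       # datagram arrives; no handshake
assert data == b"packet 1"

sender.close()
receiver.close()
```

Over the loopback interface delivery is effectively guaranteed, but across a real network a lost datagram is simply gone, which is why commercial UDP-based transfer protocols add their own retransmission and flow control.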
Network tools like ping and traceroute measure latency by determining the time it takes a given network packet to travel from source to destination and back, the so-called round-trip time. Round-trip time is not the only way to specify latency, but it is the most common. To test the latency of your Internet connection against hundreds of test servers, go to Speedtest.net, where you can measure your bandwidth and latency against a local (London) server or one in, say, Bangkok. On DSL or cable Internet connections, latencies of less than 100 milliseconds (ms) are typical and less than 25 ms desired. Satellite Internet connections, on the other hand, average 500 ms or higher latency.
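Round-trip time can be measured in the same spirit as ping with a few lines of Python: time how long a message takes to reach an echo server and come back. This sketch runs the echo server locally on the loopback interface, so the measured RTT is tiny:

```python
import socket
import threading
import time

def echo_server(server_sock: socket.socket) -> None:
    # Accept one connection and echo a single message back.
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(64))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))              # OS picks a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
start = time.perf_counter()
client.sendall(b"ping")
client.recv(64)                            # block until the echo returns
rtt_ms = (time.perf_counter() - start) * 1000
client.close()
server.close()

# Loopback round trips are typically a fraction of a millisecond;
# the same measurement against a distant host would show the real latency.
assert rtt_ms >= 0
```

Swapping the loopback server for a remote host would reproduce the London-versus-Bangkok difference described above.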
If you suffer from latency problems when it comes to file transfer, please contact Pro2col on 0333 123 1240 for more information on how you can combat this problem.
Leased Line
A leased line is a dedicated communications line set up between two endpoints by a telecommunications provider. Rather than a distinct physical cable, a leased line is in reality a reserved circuit; it has no telephone number, and each side of the circuit is permanently connected to the other.
Leased lines can be used for telephone communications, sending data as well as Internet services, and provide a much higher bandwidth than existing lines offered by Internet Service Providers.
The main advantage associated with leased lines is the increased bandwidth they provide. As they are sole, dedicated lines, the congestion that occurs on shared lines is eliminated, reducing latency and greatly increasing the speed of communication. These lines are usually purchased by large organisations that regularly use the Internet and wish to obtain a faster, more reliable connection.
If you are experiencing slow file transfer speeds, then contact Pro2col on 0333 123 1240 to find out more about how this issue can be resolved.
Managed File Transfer
Managed file transfer is an industry term used to describe a hardware or software solution that facilitates the movement of large files both inside and outside of the business, whilst maintaining the security and integrity of sensitive data. Although many managed file transfer solutions are built using the FTP file transfer protocol, the phrase was coined to illustrate those solutions that have progressed and developed, to address the disadvantages associated with basic FTP for large file transfer.
A solution classed as providing 'managed file transfer' should possess value-added features such as reporting (e.g. notification of successful file transfers), increased security and greater control of the file transfer process. These solutions enable organisations to automate, self-manage and secure the transfer of files. As companies expand, the need to transfer files between locations increases, making these features invaluable to enterprises responsible for sensitive customer data.
The development of managed file transfer solutions has had an enormously positive impact on business processes. In a number of market sectors, file transfer accounts for a significant percentage of man-hours spent sending and monitoring transmissions. Managed file transfer eliminates the need for these manual processes, as the solutions are designed specifically to do the job for you.
To find out more about managed file transfer and how it can help increase efficiencies within your organisation, please contact Pro2col on 0333 123 1240.
Still confused? Download our e-book:
Network Layer
The concept of a network layer, or 'layered network', was developed to account for the rapid changes that occur in technology. It allows newly developed protocols to work alongside one another to achieve a specified task, for example a secure file transfer.
The Higher layers of a network are closer to the user and deal with more abstract data, relying on lower layers to convert data into forms that can be physically manipulated for transmission.
The separate network layers are designed to perform a specific task, each layer passing information up and down to the next subsequent layer as data is processed.
Payment Card Industry Data Security Standard (PCI DSS)
The PCI Security Standards Council is an open global forum, formed in 2006 by five founding global payment brands: American Express, Discover Financial Services, JCB International, MasterCard Worldwide and Visa Inc.
A global security standard, PCI DSS comprises 12 comprehensive requirements designed to enhance the security of cardholder data. The most pertinent of these requirements in terms of large file transfer are:
- Requirement 3: Protect stored cardholder data.
- Requirement 4: Encrypt transmission of cardholder data across open, public networks.
- Requirement 6: Develop and maintain secure systems and applications.
- Requirement 9: Restrict physical access to cardholder data.
- Requirement 10: Track and monitor all access to network resources and cardholder data.
Companies that do not comply with PCI DSS are liable to incur operational and financial consequences enforced by the individual payment brands.
Alternatively, if you’d like to find out more about the secure file transfer solutions in our portfolio that will help you to achieve PCI compliance, please contact Pro2col on 0333 123 1240.
Packets
In the world of IT, a packet is a unit of data. When sending data over a network, messages or files are broken down into manageable packets before transmission. Depending on the protocol used, a packet may also be referred to as a datagram, a segment, a block, a cell or a frame. Once transmitted, the packets are reassembled at the receiving end to recreate the original data file.
The structure of a packet can vary depending on which protocol is used to format it. Typically a packet consists of a header and a payload. The header carries information regarding the reassembly of the packets e.g. where they came from, where they are going to and in what order. The payload refers to the data that it carries.
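The header-plus-payload structure can be sketched with Python's standard `struct` module. The 8-byte header below (sequence number, total packet count, payload length) is purely illustrative; real protocols such as IP and TCP define much richer headers:

```python
import struct

# Illustrative fixed header: sequence number, total packet count and
# payload length, all in network (big-endian) byte order.
HEADER = struct.Struct("!HHI")

def make_packets(data: bytes, size: int) -> list:
    """Split data into payloads of at most `size` bytes, each with a header."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [HEADER.pack(seq, len(chunks), len(chunk)) + chunk
            for seq, chunk in enumerate(chunks)]

def reassemble(packets: list) -> bytes:
    """Sort packets by their sequence number and strip the headers."""
    ordered = sorted(packets, key=lambda p: HEADER.unpack(p[:HEADER.size])[0])
    return b"".join(p[HEADER.size:] for p in ordered)

message = b"files are split into packets and reassembled on arrival"
packets = make_packets(message, 10)
# Even if packets arrive out of order, the headers allow correct reassembly.
assert reassemble(list(reversed(packets))) == message
```

The sequence number in each header is what lets the receiver restore the original order, exactly the reassembly information the paragraph above describes.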
Personal Data
Personal data means any data that makes a living person identifiable. This could be 'direct', such as their name, or 'indirect', where combined information could identify the person. GDPR also defines special categories of sensitive data, covering racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, details of health, sex life or sexual orientation, and genetic or biometric data used to identify someone. Identifiers such as email addresses and IP addresses also count as personal data.
Right to Erasure
Under GDPR, the data subject has the right to request erasure of personal data.
Sarbanes Oxley (SOX)
The Sarbanes Oxley Act is a US federal law, enacted on 30th July 2002, governing financial reporting and accountability processes within public companies. The legislation was brought into force as a safeguard, following a succession of corporate accounting scandals, involving a number of high profile organisations. These companies purposefully manipulated financial statements, costing investors billions of dollars.
Sarbanes Oxley (SOX) contains 11 titles, detailing specific actions and requirements that must be adopted for financial reporting, ranging from corporate board responsibilities to criminal penalties incurred as a consequence of non-compliance. The most significant of these titles in terms of data transfer is section 404.
Sarbanes Oxley Standards
Section 404 states that companies governed by SOX are required to:
- Publish information in their annual reports stating the responsibility of management for establishing and maintaining an adequate internal control structure and procedures for financial reporting, detailing their scope and adequacy.
- Include an assessment of the effectiveness of internal controls.
What are the implications of SOX in terms of file transfer?
In order to provide this information and ensure compliance with US law, companies governed by SOX must implement file transfer processes that:
- Accurately record all financial data, including audit logs.
- Prevent access to and modification of financial data by unauthorised users.
- Track the movement of data as it crosses application and organisational boundaries.
Our specialists at Pro2col can help you to source and implement a SOX compliant secure file transfer solution to suit your business requirements. Please contact us on 0333 123 1240 for more information.
Secure File Transfer
Security is of paramount importance in today's corporate environments, due to the sensitive nature of the information that they hold. Industry standards such as PCI DSS, Sarbanes Oxley and HIPAA dictate an organisation's responsibility to secure such information and, as such, the need for secure file transfer solutions has become a priority.
A number of secure file transfer protocols have been developed over the years as a solution to the issue of data security. As there are several different ways of sending and receiving files, e.g. HTTP (via a web browser), FTP (client to server) and email, there are a variety of security protocols used to secure communication channels. The key secure file transfer protocols include:
- FTPS (FTP run over SSL/TLS)
- HTTPS (HTTP run over SSL/TLS)
- SFTP (file transfer over SSH, the Secure Shell protocol)
The FTPS and HTTPS file transfer protocols send files over a TCP connection. This TCP connection is protected by a TLS (Transport Layer Security) or SSL (Secure Socket Layer) security layer that runs beneath the FTP and HTTP protocols. Simplified, SSL is a protocol that establishes an agreement between a client/browser and a server: the server presents a public key, which the client uses during the handshake to authenticate the server and agree a session key, and that session key then encrypts the files sent using either the FTP or HTTP file transfer protocol. Only the server, which holds the matching private key, can complete this exchange.
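As a minimal sketch of how a client might perform an FTPS upload, here is an example using Python's standard-library `ftplib`. The hostname and credentials are hypothetical placeholders:

```python
import ssl
from ftplib import FTP_TLS

# Hypothetical server details -- replace with your own.
HOST = "ftps.example.com"

# A default context verifies the server's certificate against the
# system trust store, much as a browser does for HTTPS.
ctx = ssl.create_default_context()

def upload(path: str) -> None:
    """Upload a local file to an FTPS server over an encrypted channel."""
    ftps = FTP_TLS(context=ctx)
    ftps.connect(HOST, 21)
    ftps.login("user", "password")   # placeholder credentials
    ftps.prot_p()                    # switch the data channel to TLS too
    with open(path, "rb") as f:
        ftps.storbinary(f"STOR {path}", f)
    ftps.quit()
```

Note the `prot_p()` call: in FTPS the control and data channels are separate TCP connections, and both need to be secured.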
SFTP is not FTP run over SSH, but rather a standalone secure file transfer protocol designed from the ground up. The file transfer protocol itself does not provide authentication and encryption; it relies on the underlying protocol (SSH) to provide this. SSH is used to establish a connection between a client and server that acts like an encrypted tunnel. This encrypted tunnel protects any data files that are sent via this secure connection.
If you want to find out more information about secure file transfer solutions, please contact Pro2col on 0333 123 1240.
SFTP File Transfer
SFTP file transfer, or the ‘SSH file transfer protocol’ as it is more formally known, is a network communications protocol used for sending data securely over a network. A common misconception associated with SFTP is that it is FTP run over SSH – this is not the case. SFTP, sometimes referred to as ‘secure file transfer protocol’, is an independent protocol that was developed from scratch.
Built to look and feel like FTP due to FTP's popularity, SFTP was developed as a secure alternative to it. Based on the Secure Shell protocol, SFTP encrypts both the commands and the data exchanged between client and server, using session keys negotiated via SSH's key-exchange algorithms. This protects against eavesdroppers and attackers, ensuring any data sent using SFTP remains secure.
More than just a dedicated file transfer protocol, SFTP also enables permission and attribute manipulation, file locking and more. That said, a drawback associated with the use of SFTP is the limited number of client applications available that are fully compatible with SFTP servers.
If you would like to know what Pro2col can do for you in terms of secure file transfer, please contact us on 0333 123 1240.
SSH File Transfer
SSH (Secure Shell) is a network protocol used to establish a secure connection between a client and server. Once a connection has been established, it acts like an encrypted tunnel down which data can be exchanged securely. SSH file transfer is used to maintain the confidentiality and integrity of data communications over insecure networks such as the Internet.
SSH file transfer also provides authentication using public-key cryptography. This form of cryptography works on a private key/public key system: each side of the transfer holds its own key pair. A message encrypted with the recipient's public key can only be decrypted with the corresponding private key.
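The public/private key idea can be demonstrated with textbook RSA using deliberately tiny numbers. This is purely an illustration of the mathematics, not the actual SSH key exchange and not secure cryptography:

```python
# Textbook RSA with tiny primes (p=61, q=53) -- illustration only,
# NOT real-world crypto: real keys use numbers thousands of bits long.
n = 61 * 53          # public modulus (3233)
e = 17               # public exponent -- the public key is (n, e)
d = 2753             # private exponent: (e * d) mod lcm(60, 52) == 1

def encrypt(m: int) -> int:
    """Anyone holding the public key (n, e) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can decrypt."""
    return pow(c, d, n)

c = encrypt(65)
print(c, decrypt(c))  # 2790 65
```

Encrypting with the public key and decrypting with the private key is exactly the asymmetry described above: publishing the public key lets anyone send you a secret, but only you can read it.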
Originally developed as a replacement for Telnet and other insecure remote shells, SSH is used principally on Linux and Unix-based systems to access shell accounts.
SSL File Transfer
SSL (Secure Socket Layer) is a cryptographic protocol used when sending files over TCP (Transmission Control Protocol) networks, for example the Internet. SSL file transfer works by using two keys: a public key, which the sender uses to set up the encrypted connection, and a private key known only to the receiving server, which completes the exchange and allows the data to be decrypted and read. These keys are obtained once an SSL certificate has been purchased for the applicable web server, and are used to authenticate the server and establish an SSL connection between a browser and a web server. This secure connection ensures that all data transmitted between the browser and the web server remains private and retains its integrity. Even if a third party did obtain the data file, they would not be able to read the information contained within it.
The SSL protocol is independent of the file transfer protocol used to transfer files; therefore it can be used in conjunction with HTTP, FTP, POP and more. It is the standard protocol used when purchasing items online from an ecommerce site and can be identified when in use by observing the web server URL. The URL of a standard web server will start with http://, while the URL of a secure web server will start with https://.
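That URL check is simple to automate. The sketch below just inspects the scheme string with Python's standard-library `urllib.parse`; the example URLs are hypothetical:

```python
from urllib.parse import urlparse

def is_secure(url: str) -> bool:
    """True when the URL names an SSL/TLS-secured scheme."""
    return urlparse(url).scheme in ("https", "ftps")

print(is_secure("https://shop.example.com/checkout"))  # True
print(is_secure("http://shop.example.com/"))           # False
```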
Subject Access Requests (SARs)
Under GDPR, the data subject has the right to request a copy of all the personal data a data controller holds on them. This extends to personal data held within the controller's supply chain, e.g. by its data processors.
TCP (Transmission Control Protocol)
TCP (Transmission Control Protocol) is one of the two core protocols used in data communications, the second being IP. Part of the Internet Protocol Suite (often referred to as TCP/IP), TCP is a transport-layer protocol responsible for higher-level operations. It provides reliable, ordered delivery of data packets between two locations and can also offer management services such as controlling message size, network congestion and rate of exchange.
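The "reliable, ordered byte stream" behaviour can be seen with a local pair of connected stream sockets, which behave like a miniature TCP connection. Note this is a loopback illustration, not a real network transfer:

```python
import socket

# A connected pair of stream sockets behaves like a local TCP
# connection: bytes arrive reliably and in the order they were sent,
# but the stream has no message boundaries -- two separate sends
# may well be read back as a single chunk.
a, b = socket.socketpair()
a.sendall(b"first ")
a.sendall(b"second")

data = b""
while len(data) < len(b"first second"):
    data += b.recv(1024)

print(data)  # b'first second'
a.close()
b.close()
```

The ordering guarantee is what distinguishes TCP from datagram protocols such as UDP, where packets may arrive out of order or not at all.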
The Health Insurance Portability and Accountability Act (HIPAA)
HIPAA is the abbreviation of ‘The Health Insurance Portability and Accountability Act’. It is a US federal law governing the protection and privacy of sensitive patient health care information. Enacted by Congress in 1996, HIPAA's rules were subsequently implemented and brought into enforcement by the Department of Health and Human Services (HHS).
The objective of HIPAA is to encourage the development of an effective health information system. Likewise, the standards introduced must strike a balance between efficiently transmitting health care data to ensure quality patient care, whilst enforcing all necessary measures to secure personal data. This goal was achieved by establishing a set of standards relating to the movement and disclosure of private health care information.
HIPAA incorporates administrative simplification provisions, designed to help with the implementation of national standards. As such, HIPAA is broken down into 5 core rules and standards. The HHS assigned government bodies, such as the OCR (Office for Civil Rights) and CMS (Centers for Medicare & Medicaid Services), to organise and enforce these rules and standards. The OCR was assigned to administer and enforce the Privacy Rule and, more recently, the Security Rule. CMS implements and governs electronic data interchange (EDI), including Transactions and Code Set standards, Employer Identification Standards and the National Identifier Standard.
HIPAA Rules and Standards
Privacy rule: Addresses the appropriate safeguards required to protect the privacy of personal health information. It assigns limits and conditions concerning the use and disclosure of personal information held by healthcare organisations or any other businesses affiliated with these organisations.
Security Rule: The Security Rule complements the Privacy Rule but focuses specifically on Electronic Protected Health Information (EPHI). It defines three processes where security safeguards must be implemented to ensure compliance: administrative, physical, and technical.
Transactions and Code Set Standards: In this instance, the term transactions, refers to electronic exchanges involving the transfer of information between two parties. HIPAA requires the implementation of standard transactions for Electronic Data Interchange (EDI) of health care data. HIPAA also adopted specific code sets for diagnosis and procedures to be used in all transactions.
Employer Identification Standards: HIPAA requires that employers have standard national numbers that identify them on all transactions – the Employer Identification Number (EIN).
National Identification Standards: All healthcare organisations that qualify under HIPAA legislation and use electronic communications must use a single National Provider Identifier (NPI) on all transactions.
What are the implications of HIPAA in terms of file transfer?
To ensure compliance with HIPAA in terms of large file transfer, Healthcare organisations must:
- Protect the privacy of all individually identifiable health information that is stored or transmitted electronically.
- Limit disclosures of protected health information whilst still ensuring efficient, quality patient care.
- Enforce stringent requirements for access to records.
- Implement policies, procedures and technical measures to protect networks, computers and other electronic devices from unauthorised access.
- Put in place business associate agreements with business partners that safeguard their use and disclosure of PHI.
- Update business systems and technology to ensure they provide adequate protection of patient data.
Our specialists at Pro2col can help you to source and implement a HIPAA compliant, secure file transfer solution to suit your business requirements. Please contact us on 0333 123 1240 for more information.
TLS File Transfer
TLS (Transport Layer Security) is a cryptographic protocol used to secure communications over networks such as the Internet. TLS is a more recently developed secure file transfer protocol than SSL (Secure Socket Layer), but works upon the same principles.
The TLS file transfer protocol is made up of two layers. The first, the TLS record protocol encrypts segments of the network connections at the Transport Layer of the Internet Protocol Suite, ensuring any data sent over this connection remains secure.
The second is referred to as the TLS handshake protocol. It authenticates the server, typically via a certificate containing the server's public key, and uses public-key cryptography – in which only the server holds the matching private key – to agree the session keys that the record protocol then uses to encrypt the data files being sent.
If you refer to the TCP/IP model, the TLS protocol runs somewhere in between the Transport layer and the Application layer, and it runs independently of the application protocols carried over it, such as FTP or HTTP.
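That independence is visible in code: in Python's standard-library `ssl` module, any plain TCP socket can be wrapped in TLS, and whatever application protocol runs on top is unchanged. A minimal client-side sketch (no connection is made here; the function is illustrative):

```python
import socket
import ssl

# Client-side TLS context: certificate verification on, legacy
# protocol versions disabled.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

def open_secure(host: str, port: int) -> ssl.SSLSocket:
    """Open a TCP connection and upgrade it to TLS.

    The application protocol (HTTP, FTP, SMTP, ...) then runs
    unchanged over the encrypted socket -- TLS does not care
    what traffic it carries.
    """
    raw = socket.create_connection((host, port))
    return ctx.wrap_socket(raw, server_hostname=host)
```

The same `open_secure` helper would serve equally for an HTTPS request on port 443 or a secured SMTP session on port 465, which is precisely the layering the TCP/IP model describes.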
Virtualization
As a somewhat abstract concept, it is crucial to understand the context in which we are using the word ‘virtual’ before moving onto the definition of virtualization. The term virtual, in this scenario, is defined as “computing not physically existing as such but made by software to appear to do so”.
Virtualization as a concept represents the ‘virtual’ partitioning or division of a single computing entity and its resources, whether that entity be a server, application, network, storage or operating system. Alternatively, you can interpret the concept from an almost opposing standpoint and view it as multiple computing entities being combined to appear as one logical entity through a virtualisation layer. Consequently, there are many different forms of virtualization; in this instance the focus is server virtualization.
Originally devised by IBM in the 1960s to partition large mainframe hardware, virtualization technology has in recent years been adopted and developed to apply to the now predominant x86 platform. Server virtualization software enables users to virtualise a piece of hardware, including its components, e.g. hard disk, RAM and CPU. The functionality from each component can then be assigned as desired to run multiple applications, operating systems or appliances on a single piece of hardware in virtual partitions.
There are multiple advantages associated with virtualization. The segregation of expensive computer resources increases efficiency by consolidation, reducing the number of physical servers necessary to support a business's IT solutions. This can save companies large amounts of money on hardware acquisition as well as rack space, which comes at a premium. Additional advantages include quick deployment, increased security, centralised management, business continuity and disaster recovery, reduced administration, and lower energy consumption minimising carbon footprint – to name just a few.
Of course, as with any IT solution, there are also disadvantages accompanying this technology – for example licensing costs, user complexity, support compatibility, security management issues and deployment dilemmas (e.g. choosing the right solutions to host in a virtual environment, as not all are suitable) – but with an experienced IT team most of these issues become insignificant.
WAN (Wide Area Network)
A network that spans a wide geographical area is referred to as a WAN (Wide Area Network). A WAN consists of a collection of LANs (Local Area Networks) connected by a router that maintains both the LAN information and the WAN information. The WAN side of the router will then connect to a communications link such as a leased line (a very expensive dedicated line), existing telephone lines or satellite channels.
This form of networking can facilitate communications between computers on opposite sides of the world, and the most commonly known WAN in today's society is the Internet. The majority of WANs are not owned by a specific company – the Internet, for example – but rather exist under collective or distributed ownership and management. That said, private WANs can be built specifically to enable communication between different buildings within the same organisation that are remotely situated.