Dropbox – Replace It or Embrace It?

Many IT professionals have concerns about cloud-based file sharing, and rightly so: lack of visibility, limited control, data leakage and weak governance are all valid worries. Naturally, Dropbox, with its huge footprint, has come under more scrutiny than most, but Dropbox is fighting back.

Firstly, Dropbox recently announced that it is cash-flow positive. Whilst it has taken many years to achieve, this is a game changer, as other major cloud-based systems such as Box still rely on round after round of funding. Secondly, I’ve seen firsthand the major shift Dropbox has made in the UK, placing considerable effort into developing its corporate customer base.

I previously took the position that any company using Dropbox for corporate file sharing was quite frankly bonkers. They would suffer from the lack of controls afforded by on-premise Managed File Transfer installations and from potential data protection issues over where the data resided. However, Dropbox has made impressive progress, and its Business and Enterprise offerings are compelling when blended with some of the 300,000+ integrated solutions accessing Dropbox via its REST API.

Losing Company Data

My biggest concern was data leakage. How could the business ensure that sensitive company information wasn’t being shared without the necessary controls provided via DLP integration with in-house solutions? A quick search and 30 seconds later I’d found three DLP vendors providing Dropbox integration! What about antivirus? Another search and another positive result! And reporting? Plenty of options! OK, so Dropbox is now a potential solution for some of our customers where ease of use is paramount and IT currently has limited control.


With the arrival of GDPR (see article here), data residency is now an even more pressing issue. Where a company keeps its data can be mandated by customer contracts and compliance policy. Where strict controls are required, or customers demand UK data residency, an on-premise or private cloud installation is the safe bet.

However, for those businesses keen to adopt and benefit from ‘Cloud’ offerings, GDPR presents a challenge, one that is an even bigger problem for cloud vendors, such as Dropbox. Dropbox is therefore opening its first European datacentre in Germany, which is in beta test with selected customers.

Gaining Visibility

There is little doubt that your end-users have adopted cloud solutions to fulfil their file sharing requirements. With hundreds available online, where do you start?

The first step is to gain visibility of what cloud solutions are being used, understanding which employees are accessing them and how much company data could potentially be in an unsanctioned tool.

Dropbox has hundreds of millions of registered users of its Free/Personal offering. These accounts were frequently created to share company data, and were therefore registered with company email addresses. As a Dropbox Partner, we can help you get a much clearer view of its use within your organisation. If you would like more information on what company information you already have in Dropbox, drop us an email at [email protected] or call 020 7118 9640.

Understanding Requirements

Your employees have needs that aren’t being fulfilled by company-sanctioned solutions, or maybe you just don’t have one. Either way you need to understand what the business needs in order to ensure the appropriate solution is provided. To help you ask the right questions of the business we’ve written an Expert’s Guide to Managed File Transfer.

When you know which employees are using unsanctioned tools and you’ve discussed with them what they need from a file sharing solution, you’re in a much stronger position to make an informed decision.

Replace It or Embrace It?

IT professionals will be driven to embrace or replace cloud-based technologies, depending upon the security and compliance posture adopted by their company.

Those that embrace Dropbox will benefit from an extremely simple-to-use, increasingly feature-rich solution with an established user base. In my opinion, though, a business should only use it alongside the right integrated solutions to provide the visibility and control needed.

For those that choose to replace Dropbox to address compliance, data residency and integration, a wide range of solutions is available. Many existing Managed File Transfer solutions provide person-to-person file sharing, and some address synchronisation; these are typically modules that can be added to existing software installations.

Get Your Free Copy Of Our Brand New Book,

The Expert Guide to Managed File Transfer




The Expert Guide distills our knowledge from more than 700 Managed File Transfer
projects into a resource to get your project started.

It includes:

  • What is Managed File Transfer, EFSS, Big Data, High Availability etc.?
  • The problems Managed File Transfer can solve, with real-life use cases.
  • 40 key considerations for a successful Managed File Transfer project.
  • A side-by-side comparison of eight leading Managed File Transfer solutions.

Click here for your free copy.

How do I monitor my Managed File Transfer system?

As we all know, any computer system tends to bring together multiple products, often from different vendors. These may encompass scheduling, user authentication, monitoring, databases and a whole host of others. Finding one unified product can be problematic, as very few vendors offer a lock, stock and barrel solution.


With regards to a Managed File Transfer (MFT) solution, most products contain a dedicated reporting component, available as either an integral part of the solution or as an additional module which can be purchased and installed separately.

However, when it comes to event monitoring, the majority of solutions have kicked the ball over the fence and instead have taken the approach of providing some simple methods to alert operators or administrators of potential problems. In this article, I’ll explore some of the ways that you can use these interfaces to best suit your needs.

Before you even consider how to interface your MFT with your monitoring, take a long look at whether something is actually worth monitoring. For example, would you want to be alerted when someone fails to log in to your FTP server? If it’s a wannabe hacker whose IP address gets automatically locked out, then probably not. If it’s a production batch account, then probably yes. Think about your MFT system in component pieces and judge each part on its own merit. Just because you monitor some of it doesn’t mean you have to monitor it all.

The Problem with Email

One of the easiest monitoring methods is to generate an email when something goes wrong; unfortunately, this is also one of the biggest areas for monitoring failures, for a couple of reasons.

First, relying on email means relying on your mail system, which can itself fail or introduce delays; emails can get lost, or be marked as spam by the mail server if enough are generated. Secondly, if you are only notified of failures and no emails arrive, how do you know your system is working at all?
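One mitigation for the "silent failure" problem is a heartbeat: have the MFT system emit a routine "all clear" message and alert when one has not arrived within the expected window. A minimal sketch, with illustrative times and a purely hypothetical 15-minute threshold:

```python
from datetime import datetime, timedelta

def heartbeat_overdue(last_heartbeat, now, max_gap=timedelta(minutes=15)):
    """Return True when no heartbeat has been seen within the allowed gap."""
    return now - last_heartbeat > max_gap

now = datetime(2016, 3, 1, 10, 30)
healthy = heartbeat_overdue(datetime(2016, 3, 1, 10, 20), now)  # recent beat
silent = heartbeat_overdue(datetime(2016, 3, 1, 9, 0), now)     # gone quiet
```

The point is simply that absence of alerts and absence of a working system look identical unless something positively confirms health.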

Simple Network Management Protocol

SNMP is a protocol designed for monitoring a network and its various devices, and several commercial monitoring solutions support it. However, you need to check whether your MFT system is able to create an SNMP trap; if not, you are limited to monitoring just the MFT server(s).

Log Watcher

Most monitoring tools contain a log watcher of some description. The monitoring solution can be set to read your log files on a regular basis and will generally remember which parts of the log have already been read. An alert is raised when a certain regular expression is encountered in the log file.

Be careful when using this approach that you do not inadvertently change the log levels of the MFT solution and that error text does not change with software upgrades.
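To make the mechanics concrete, here is a minimal sketch of the log-watcher pattern: remember how far you have read, scan only the new lines, and raise an alert when a pattern matches. The log lines and the ERROR/FATAL pattern are illustrative; real MFT log formats vary by product.

```python
import re

# Hypothetical log excerpt; substitute your product's actual log file.
LOG_LINES = [
    "2016-03-01 10:00:01 INFO  Transfer job 'daily-batch' started",
    "2016-03-01 10:00:05 ERROR Transfer job 'daily-batch' failed: connection refused",
    "2016-03-01 10:00:06 INFO  Retrying in 60 seconds",
]

# Pattern an operator might watch for; adjust to your product's error text.
ALERT_PATTERN = re.compile(r"\b(ERROR|FATAL)\b")

def scan_from_offset(lines, offset):
    """Scan only lines not yet seen, returning alerts and the new offset."""
    alerts = [line for line in lines[offset:] if ALERT_PATTERN.search(line)]
    return alerts, len(lines)

alerts, new_offset = scan_from_offset(LOG_LINES, 0)
```

Persisting `new_offset` between runs is what stops the watcher re-alerting on errors it has already reported.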

Event Log

Some MFT solutions allow writing to the Windows event logs, which you can then monitor with any commercial monitoring solution. On a Linux or Unix system, you would instead be checking the /var/log directory (on many distributions, system logs are written to /var/log/messages).


Database

If your MFT solution writes log records to a database, use a query launched from the monitoring solution to routinely extract error events. Depending upon the frequency of execution, this can give near real-time results.
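As a sketch of that database approach, using an in-memory SQLite table as a stand-in for an MFT audit database (table and column names are illustrative and will differ between products):

```python
import sqlite3

# In-memory stand-in for an MFT audit database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE transfer_log (id INTEGER, status TEXT, detail TEXT)")
db.executemany("INSERT INTO transfer_log VALUES (?, ?, ?)", [
    (1, "SUCCESS", "invoice.csv delivered"),
    (2, "ERROR", "partner SFTP unreachable"),
    (3, "SUCCESS", "report.pdf delivered"),
])

def fetch_errors_since(conn, last_seen_id):
    """Return error rows newer than the last id the monitor processed."""
    return conn.execute(
        "SELECT id, detail FROM transfer_log WHERE status = 'ERROR' AND id > ?",
        (last_seen_id,),
    ).fetchall()

errors = fetch_errors_since(db, 0)
```

Tracking the last id (or timestamp) seen is what keeps each polling run cheap and avoids duplicate alerts.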

And Finally…Scripting

If your MFT product provides an API, why not use some scripting to generate events? A cron job or Windows scheduled task can routinely query your system for noteworthy events.
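The shape of such a scheduled check might look like this. `fetch_events()` is a hypothetical stand-in for whichever REST client or command-line query your product actually provides; the severity levels and times are illustrative.

```python
from datetime import datetime

def fetch_events():
    # Hypothetical stand-in for a vendor-specific API call.
    return [
        {"time": datetime(2016, 3, 1, 10, 0), "severity": "INFO", "msg": "job ok"},
        {"time": datetime(2016, 3, 1, 10, 5), "severity": "ERROR", "msg": "job failed"},
    ]

def noteworthy(events, since, alert_on=("ERROR", "FATAL")):
    """Keep only events newer than the last run that merit an alert."""
    return [e for e in events if e["time"] > since and e["severity"] in alert_on]

alerts = noteworthy(fetch_events(), since=datetime(2016, 3, 1, 9, 59))
```

Each run records when it last checked, so the next run only considers events newer than that.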

Past the Interface

Now that you’ve worked out a way to get events from your MFT system into your monitoring solution, you need to consider how you want to be alerted. This is, of course, the responsibility of the monitoring solution, but consider how you would like to grade the events you receive. Do they all require your immediate attention, or can some be given a lower priority while others wait? In practice, it makes sense to prioritise events before passing them to the monitoring solution.

Reactive recovery

Whichever method you use to pass events to the monitoring tool, you may find you also have the opportunity to execute certain activities when an issue is detected. Many monitoring tools possess this functionality (and if you script your monitoring interface, you can build it in there too). Good examples are restarting a failed interface or enabling an alternate workflow.

Find Out the Top Ten Signs your MFT Software Needs a Health Check!


Like any computer system, your managed file transfer solution needs to be kept well tuned in order to provide the best performance.

Learn to recognise the top ten signs that your Managed File Transfer solution needs a health check.

Download our guide now: ‘10 Common Signs your Managed File Transfer Solution Needs a Health Check’.

Pro2col’s engineers are trained and accredited to provide a comprehensive solution health check of your managed file transfer software and environment. For more information get in contact today by calling our team on 020 7118 9640.

Do you need Managed File Transfer or just FTP?

Secure File Transfer

FTP and SFTP servers perform two basic tasks: “put” and “get”. You can put files on the server or get files from it. If security is not a concern, FTP server software is an easy and inexpensive way to accomplish this.

Typical use cases include backing up your Cisco Unified Call Manager (CUCM) or other applications, receiving non-sensitive data from business partners or enabling remote employees to upload non-confidential reports.

However, file transfer requirements have changed considerably since FTP was released back in 1971. End users are more aware of the risks of identity fraud, business partners require at least the base security levels outlined in ISO 27001, file sizes and volumes have grown exponentially, and business workflows are ever more complex. Add in the legal requirements of GDPR and the UK Data Protection Act, the fines that can be levied by the ICO and the economic impact of a data breach, and an unsecured “put/get” system fast becomes a liability.


Managed File Transfer (MFT)

Research conducted by Vanson Bourne, on behalf of IBM last year, identified the top three concerns that drove the implementation of Managed File Transfer as being security, complexity and growth.

The top three drivers for choosing MFT over FTP


When security and compliance are high priorities, MFT solutions do more than simply secure files in transit. Standard security features in MFT solutions include:

  • Support for secure protocols and refusal of unsecure connections
  • Encryption for stored data and the assurance that unencrypted versions of the file are never written to the server
  • Perimeter security, such as a reverse proxy that operates as a pass-through and does not temporarily store data
  • Support for the current versions of privacy standards, such as PCI v3.1 and HIPAA
  • The ability to support security policies, such as complex/expiring passwords
  • Hacking detection with automated shut down of offending users or domains

MFT solutions can enable PCI compliance for the management of credit card data, secure workflows for the sharing of confidential data from patient records to patents and present a secure interface to your customers and partners for system to system or person to person transfers.



Complexity and Automation

Files are transferred between applications, systems and end-users. The receipt of a file may trigger an entire workflow, with complex what-if routing and varied notifications. Automated “push” and “pull” technology, along with the ability to automatically sort data and send it to pre- and post-processing applications, is a key driver in the need for MFT. Typical capabilities include:

  • Event-driven commands and notifications/alerts (e.g., “on file upload, do…”)
  • API or command-line tools
  • Integration with key systems such as Active Directory, Outlook, SharePoint, Salesforce, anti-virus and Data Loss Prevention solutions
  • Flexibility to handle differing customer requirements such as protocol, password protection and PGP.
  • Full reporting and notification on failures enabling prompt action to meet customer SLAs.
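The first bullet, the event-driven “on file upload, do…” style, can be sketched as a tiny dispatch table. The event name, handler and routing rule here are all hypothetical, standing in for whatever rule engine your MFT product exposes:

```python
# Registry mapping event names to handler functions.
handlers = {}

def on(event):
    """Decorator that registers a handler for a named event."""
    def register(fn):
        handlers.setdefault(event, []).append(fn)
        return fn
    return register

@on("file_uploaded")
def route_by_extension(filename):
    # Illustrative rule: EDI files go to a processor, everything else archives.
    return "edi-processor" if filename.endswith(".edi") else "archive"

def fire(event, *args):
    """Run every handler registered for the event, collecting results."""
    return [fn(*args) for fn in handlers.get(event, [])]

results = fire("file_uploaded", "orders.edi")
```

In a real product the equivalent rules are usually configured through the GUI rather than written as code, but the trigger-then-route shape is the same.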

MFT is often described as the glue between different systems and elements of the business, linking processes and people, enabling information flows while confirming security standards are met. Most solutions can be administered via a web-based GUI, removing the need for scripting expertise.


Growth in File Size and Volume

Thanks to new software applications, increased storage capacity and a fundamental shift to online working, both the volume and size of files have grown exponentially, and the growth rate is only getting faster. It took 51 years for hard discs to reach 1TB and only a further two to reach 2TB. Our customers now regularly need to transfer files of over 5GB, and frequently much more. With its lack of compression and checkpoint restart, FTP is not designed to manage these large file sizes. Many of our customers also struggle with the degree of manual intervention required by IT to set up new transfers via their FTP server and the lack of self-service options. MFT solutions address this growth with features such as:

  • EFSS options for end-users to transfer files under policy-controlled conditions
  • Modules for file acceleration or UDP-based protocols, now available for many MFT solutions
  • Self-service options for end-users
  • Web clients to simplify the movement of files over 2GB via HTTP(S)
  • Auditing to ensure compliance and a full view of who is sharing what
  • Automation of workflows to reduce the need for manual intervention



So do you need Managed File Transfer? If you handle sensitive data, need to be PCI compliant, work with partners who demand security, have more transfers than you have man hours to handle, are juggling varied demands from different parts of the business or simply have to move increasingly large files, I would suggest MFT would be of benefit. As not all MFT solutions offer the same functionality, it’s important to determine the goals for your implementation and understand how the MFT functionality will be used before you start your product research.

Download a Comparison of 8 Leading Managed File Transfer Solutions!


In this essential pack you’ll also find…


  • Key features and frequently asked questions

  • Other business policies that will need to be considered

  • Access to additional resources

  • Side by side comprehensive comparison

    * Updated to include new vendors (October 2015)

5 Reasons to Replace Your Home Grown File Transfer System

Walk into any business and the chances are that workflows and processes rely on old-fashioned scripts or home-grown file transfer systems to move files in and out of the business. It has been done this way for literally decades; I even remember writing a few such scripts myself in my early days in IT.

However this approach has several drawbacks:

1. Complexity

Scripts and home-grown systems are written, and often supported, by a small team, sometimes just a single engineer, and they are often poorly documented, if at all. All is well while they work, but if they need to be altered, or the engineers who wrote them move on to new roles, changing them becomes a slow and costly exercise. New security standards can be tricky to integrate, and meeting each separate transfer requirement may mean using products from many vendors, duplicating capabilities and adding to IT training and support burdens.

2. Limited Visibility and Control

Scripts are often deployed as point solutions, and little thought is given to auditing and controlling them. Again, this is fine while they work, but troubleshooting why they have failed, or even auditing what has been sent, is difficult, and checking the progress of in-transit files is almost impossible.

3. Employees circumvent IT

When sending data to outside organisations, employees want a painless process. Waiting for IT to go through change control to set up a process can take a long time, and for one-off transfers it is often easier to use free web-based file sharing services. While this works, there is again no audit of what data, or even how much data, is leaving the organisation. A recent infographic by Globalscape showed that 80% of respondents knew they were circumventing corporate policies when doing this, and nearly three-quarters thought the company approved of their actions.

4. Ensuring security

The lack of auditing and visibility in these methods means there is no record of what is entering or, more importantly, leaving the organisation. IT is left to react when issues occur rather than prevent them in the first place. Data breaches by hackers may make the headlines, but breaches are far more likely to be due to human error or to corporate policies being bypassed or, in some cases, non-existent.

5. Insufficient IT resources

Finally, when all these issues combine, IT is often left firefighting issues as they occur. Recovery times are typically long, and issues are normally resolved without the root cause ever being understood.

Introducing Managed File Transfer

Managed File Transfer (MFT) addresses all these problems by centralising transfers in and out of the organisation through a single system. Most MFT systems allow you to replace custom scripts with a programmatic interface, making transfer jobs easy to understand. Auditing to a database lets IT departments see what is entering and leaving the organisation, and the log files created make troubleshooting easier and quicker.

Employees no longer need to circumvent IT to set up their own transfers, as IT can react quickly to new requests; with features such as ad-hoc messaging, the need to use Internet-based file sharing sites disappears too. All of this can be wrapped up in a set of security standards, ensuring transfers meet regulatory compliance and corporate standards.

The net result of this is to reduce the time to resolve issues as they arise, detect and resolve underlying issues which may be causing individual jobs to fail, and to free up IT to be more proactive in preventing issues. In addition, MFT systems can scale up to meet future requirements, and support for High Availability, integration with automated monitoring systems and Data Loss Prevention can add extra layers to the file transfers.

If these issues sound familiar to you, you are not alone. At Pro2col we see some or all of these issues in a significant proportion of organisations we speak to and recommending and implementing MFT solutions can lead to a much more reliable and secure file transfer infrastructure. We are the UK experts in managed file transfer with hundreds of customers of all different sizes across a wide range of industries.


The Challenge of Big Data – It’s more than just big files!

Big Data is a term that crops up a lot these days, and its meaning can be deceptive. Data is often thought of as “Big Data” once files hit a certain size, but in reality the picture is less about size and more about complexity. Big data files do not need to be measured in terabytes, or even gigabytes; it is the complexity of the data, and the inter-relationships between it and the other data sources inside an organisation, that can make it more valuable than the raw data alone. Decreasing the time taken to process the information also means decisions can be made sooner, or considered for longer, and the value of the data increases.

Gartner predicts that in 2016 big data will move on from the ingestion of data to automated analytics, with artificial intelligence (AI) being used to leverage the power of data. But before any of this can happen, you need the data in the right location and format.

Big Data Characteristics

Big data is commonly characterised by the four V’s: volume, velocity, variety and veracity.

Getting Big Data in…

The faster you can get data into your organisation, the sooner it can be analysed. In order to speed up the process of receiving the data into your organisation, several managed file transfer solutions are including proprietary, high-speed file transfer protocols. These are based around UDP streams or parallel TCP connections to increase bandwidth utilisations to over 90%.

The new protocols enhance data transfer rates significantly, enabling gigabytes of data to be delivered across the world in under a minute and making for some impressive headlines. The technology is still in its early days, meaning that no open protocols yet offer this increased utilisation, so opting for one vendor’s approach over another is largely a matter of preference.

Getting Big Data sorted…

Getting the data into your organisation is all well and good, but it is only half of the challenge. All the data needs to be analysed before it can become useful, however it arrives in your managed file transfer system from a variety of sources, in different formats and, almost invariably, not the format your central data analysis tool needs.

Two simple enhancements can increase the efficiency and speed at which you ingest your data: firstly, pushing data when it’s ready instead of waiting for it to be collected; and secondly, triggering events when a file is received by your managed file transfer system.

Remove another step from the process…

Implementing a managed file transfer solution, which has the ability to stream files to a target server, provides productivity gains over traditional store and forward style workflows. By writing a large data set directly onto the intended target system, you’re able to remove another step in the process.

Once the latency has been pared back, the next stumbling block is getting the data into a usable format. There are literally hundreds of data standards, and even the most common of these are often “augmented” with extra data by specific applications.

Integrating some form of data translation, often via post-processing scripts or applications, is a common approach. This works well until the next upgrade changes the “standard” slightly and the translation script needs to be edited or even rewritten. Modern managed file transfer solutions provide the ability to transform data so it is presented to the target system in a format it recognises and can process. This can range from simple XLS-to-XML conversions to much more complex EDI and database translations.
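To illustrate the simple end of that spectrum, here is a sketch of a post-processing translation step: CSV rows from a trading partner rewritten as XML for a downstream tool. The field names are purely illustrative; real EDI and database mappings are far more involved.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical partner feed; a real script would read this from a file
# dropped into the MFT system's landing directory.
csv_input = "order_id,amount\n1001,250.00\n1002,99.50\n"

root = ET.Element("orders")
for row in csv.DictReader(io.StringIO(csv_input)):
    # One <order> element per CSV row, keyed by its id attribute.
    order = ET.SubElement(root, "order", id=row["order_id"])
    ET.SubElement(order, "amount").text = row["amount"]

xml_output = ET.tostring(root, encoding="unicode")
```

The fragility the article describes is visible even here: rename one CSV column upstream and the mapping breaks, which is why built-in transformation features are attractive.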

A growing requirement of managed file transfer…

The world of managed file transfer has evolved to enable companies that need to move big data, to do so as efficiently as possible. Streamlining the delivery of data (of varying types, sizes and structures), from external trading partners, onto internal big data analytics solutions, is becoming a much more common requirement from our customers.

If you have a Big Data file transfer project and would like our pre-sales and technical experts’ assistance, contact us here or call +44 (0) 20 7118 9640.


Some Thoughts on TCP Speeds

As a consultant in File Transfer technologies, a common complaint that I find myself having to address is the speed that a file travels at between two servers. Generally speaking, many people expect that if they have two servers exchanging files on a dedicated traffic-free 1 Gbps line, then their transfer speed should be somewhere close to this.


One of the first things to consider is the way that TCP works compared to UDP. Whichever protocol is used, data is broken into packets before being sent to the receiving computer. When UDP (User Datagram Protocol) is used, the packets are sent ‘blind’: the transfer continues regardless of whether the data is being successfully received. This potential loss may result in a corrupted file; for a streamed video that could mean some missing frames or out-of-sync audio, but for most data it will require the file to be resent in its entirety. The lack of a guarantee makes the transfer fast, but unless it is combined with rigorous error checking (as per several large-file-transfer vendors), it is often unsuitable for data transfers.

In contrast, TCP (Transmission Control Protocol) transfers data in a carefully controlled sequence of packets; as each packet is received at the destination, an acknowledgement is sent back to the sender. If the sender does not receive the acknowledgement within a certain period of time, it simply sends the packet again. To protect the sequence, further packets cannot be sent until the missing packet has been successfully transmitted and an acknowledgement received.

Deliverability over speed / Calculating the Bandwidth Delay Product 

This emphasis on guarantee rather than speed brings with it a certain degree of delay, however; we can see this by using a simple ping command to establish the round trip time (RTT) – the greater the distance to be covered, the longer the RTT. The RTT can be used to calculate the Bandwidth Delay Product (BDP), which we will need when calculating network speeds. The BDP is the amount of data ‘in flight’ and is found by multiplying the bandwidth by the delay, so a round trip time of 32 milliseconds on a 100Mbps line gives a BDP of 390KB (data in transit).

Window Scaling

The sending and receiving computers have a concept of windows (‘views’ of buffers) which control how many packets may be transmitted before the sender has to stop transfers. The receiver window is the available free space in the receiving buffer; when the buffer becomes full, the sender will stop sending new packets. Historically, the value of the receiver window was set to 64KB as TCP headers used a 16 bit field to communicate the current receive windows size to the sender; however it is now common practice to dynamically increase this value using a process called Window Scaling. Ideally, the Receive Window should be at least equal in size to the BDP.

TCP speed fluctuations

The congestion window is set by the sender and controls the amount of data in flight. The aim of the congestion window is to avoid network overloading; if there are no packets lost during transmission then the congestion window will continually increase over the course of the transfer. However, if packets are lost or the receiver window fills, the congestion window will shrink in size under the assumption that the capacity of either the network or receiver has been reached. This is why you will often see a TCP download increase in speed then suddenly slow again.

A quick calculation…

One point to remember is that when talking about bandwidth we tend to measure in bits, whereas when referring to storage (window size or BDP) we measure in bytes. Similarly, make allowance for the decimal and binary prefixes: 1Mb = 1000Kb for bandwidth, but 1MB = 1024KB for storage.

So, given this, a 1Gbps connection with a 60ms round trip time gives a BDP of 7.15MB (1000 × 60 / 8 / 1.024 / 1.024). As mentioned, to fully utilise the 1Gbps connection we must increase the receive window to at least equal the BDP. The default (non-scaling) value of 64KB will only give us a throughput of 8.74Mbps (64 / 60 × 8 × 1.024 = 8.738Mbps).
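The arithmetic above can be wrapped in a pair of helper functions, using the same conventions as the article (bandwidth in decimal megabits, storage in binary bytes):

```python
def bdp_megabytes(bandwidth_mbps, rtt_ms):
    """Bandwidth Delay Product in MB (1Mb = 1,000,000 bits; 1MB = 1024*1024 bytes)."""
    bits_in_flight = bandwidth_mbps * 1000000 * (rtt_ms / 1000.0)
    return bits_in_flight / 8 / 1024 / 1024

def throughput_mbps(window_kb, rtt_ms):
    """Best-case throughput for a fixed receive window: one full window per round trip."""
    bits_per_rtt = window_kb * 1024 * 8
    return bits_per_rtt / (rtt_ms / 1000.0) / 1000000

bdp = bdp_megabytes(1000, 60)    # 1Gbps line, 60ms RTT
rate = throughput_mbps(64, 60)   # default 64KB window
```

Running these reproduces the figures in the text: a BDP of roughly 7.15MB and a window-limited throughput of roughly 8.74Mbps, as well as the earlier 100Mbps/32ms example of around 390KB.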

So what can you do to speed up the transfer?

Logically, you would probably want the largest receive window possible, to allow more bandwidth to be used. Unfortunately, this isn’t always a great idea: if the receiver is unable to process data as fast as it arrives, many packets may be queued up in the receive buffer, and if any packet has been lost, all subsequent packets in the buffer will be discarded and resent by the sender, due to the need to process them in sequence.

You also need to consider the abilities of the computer at the other end of the connection – both machines need to support window scaling (as per RFC 1323) and selective acknowledgements (RFC 2018).

Another option to investigate is the ability of several products to perform multithreading. Multithreaded transfers theoretically move quicker than single-threaded transfers because multiple separate streams of packets are sent, which somewhat negates the delays caused by resending lost packets. However, the transfers may still be impacted by full receive windows or disk write speeds; in addition, any file sent via multiple threads needs to be reassembled on arrival, requiring further resources. In general, most large-file transfer software is written around multithreading principles, or a blend of UDP transfer with TCP control.

Finally, consider other network users – when using large Receive Windows, remember that as the amount of data in transit at any time increases, you may encounter network usage spikes or contention between traffic.


If you have any questions about the speed of your file transfers or your chosen file transfer technology and infrastructure design give our team of experts a call on 0207 118 9640.
