After building services that allow businesses to physically ship their data from local data centres and on-premises systems to Google, the search and cloud giant has now built a complementary solution to make the process even smoother.
Transfer Service is a new part of Google Cloud designed to move data digitally. It is aimed first and foremost at businesses with “billions of files” and “petabytes of data”, and it bears the brunt of the work, validating the integrity of the data as it moves it to the cloud.
The service uses as much bandwidth as it has at its disposal to keep transfer times as short as possible, and any failures are handled automatically by the agent.
Google promises a relatively painless process: all a business needs to do is install the agent on its local server and select the directories that need moving, and the service handles the rest. The transfer can be monitored and managed through the Google Cloud console.
Although the main benefits and key selling points appear to be archiving and disaster recovery, Google says it also wants to onboard organisations looking to shift workloads and use machine learning to analyse data.
“I see enterprises default to making their own custom solutions, which is a slippery slope as they can’t anticipate the costs and long-term resourcing,” Senior Analyst Scott Sinclair says in a Google blog post.
“With Transfer Service for on-premises data (beta), enterprises can optimize for TCO and reduce the friction that often comes with data transfers. This solution is a great fit for enterprises moving data for business-critical use cases like archive and disaster recovery, lift and shift, and analytics and machine learning.”