As the cloud becomes a realistic hosting destination and technology solution for more and more organizations, it is becoming obvious that a black-and-white switch to the cloud is unlikely to happen in most real-world scenarios. This is especially true when an organization’s IT footprint is large, complex, regulated, and/or highly secured.
Business analytics faces a similar prospect: many large enterprise BI deployments remain on-premises, even with the advent of ‘cloud-first’ (or ‘cloud-only’) analytics offerings from companies like Microsoft, SAP, Looker, and Qlik, to name a few. Many larger companies are exploring the cloud option for analytics, but they are daunted by the prospect of moving their entire data operation to the cloud, and the idea is often abandoned early.
In other cases, the cloud option is used for limited scenarios where the data can be comfortably copied from an on-premises location to the cloud. But copying the data to the cloud, or ‘data shipping’, comes with its own challenges. It can be counter-productive, offsetting cloud efficiencies while introducing a new chore: keeping the cloud copy fresh and accurate. Data shipping also undermines any hope of tight data governance, since duplicate copies of the data proliferate, while the timeliness of any analysis against the data suffers.
So, it stands to reason that organizations and CDOs are interested in the “hybrid” option, where they get to run cloud-based analytics without the ‘data shipping’ headache. In response, most vendors have so far offered the dreaded private network solution (i.e., a VPN) or some other method for securing a channel between the cloud and the on-premises data source. (‘On-premises’ should really be read as ‘remote data’, covering any data source not local to the cloud implementation.)
Apart from the infrastructural mess required to get a single secure channel in place, expanding the footprint to multiple sources becomes a logistical nightmare. And that is before counting the additional administrative effort required to monitor the framework, or any related authentication mechanisms needed in the chain.
And we have yet to address performance! Running direct queries against a remote data source over a VPN is painfully slow, because technologies like JDBC and ODBC are chatty, round-trip-heavy protocols that were never designed for high-latency, long-distance links. And the bigger the load, the worse the outcome, which seems to contradict the high-scalability promise of the cloud.
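The effect is easy to model. Row-fetching protocols pull results in batches, and each batch costs at least one network round trip, so total query time scales with row count multiplied by link latency rather than with server speed. A back-of-envelope sketch (the batch size, latency, and server-time figures below are illustrative assumptions, not measurements of any particular driver):

```python
def query_time_seconds(rows, batch_size, rtt_ms, server_ms=50):
    """Rough estimate of wall-clock time to fetch `rows` rows when
    each batch of `batch_size` rows costs one network round trip.
    All figures are illustrative assumptions."""
    round_trips = -(-rows // batch_size)  # ceiling division
    return (server_ms + round_trips * rtt_ms) / 1000.0

# Fetching 1,000,000 rows in 1,000-row batches:
lan = query_time_seconds(1_000_000, 1_000, rtt_ms=0.5)  # local network
vpn = query_time_seconds(1_000_000, 1_000, rtt_ms=60)   # cross-region VPN

print(f"LAN: {lan:.2f}s   VPN: {vpn:.2f}s")
```

Under these assumed numbers the same query that takes well under a second on a local network stretches to roughly a minute over the VPN, purely from accumulated round trips; the database server itself is barely working harder.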
No wonder no one is fully committed to the cloud.
To solve the puzzle, Pyramid has released its Pulse Server application: a lightweight, on-premises server ‘spoke’ that works with a cloud-hosted Pyramid Application ‘hub’. It offers a variety of breakthroughs in the hybrid deployment of business analytics that make it a compelling choice for the job.
It goes without saying that Pulse supports, out of the box, all the SQL-based direct-query data sources supported by the main Pyramid application, making it compatible with the top data stacks in the market today. And best of all, Pulse is free with Pyramid Enterprise.
Perhaps a hybrid cloud deployment for data analytics is a good idea, after all.
To get a better sense of how this all comes together, check out these links: