Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 2
May 12, 2020

Rich Weber
Panzura

There are multiple steps an organization can take to adapt to the new normal of WFH — a new normal that will change IT forever in the way it is used, implemented, and valued. This is the frontline where remote working solutions and cloud platforms will be forged in the flames of necessity and demand.

Start with: Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 1

Start with the data

Data has gravity. It takes a long time to move and manage. So, you need solutions that make getting the data into the cloud easy by moving it there in stages. Consider a hybrid model that puts the data in the cloud but accesses it through on-prem technology. A hybrid model can also allow the business to move selective workflows, applications, or users into the cloud gradually. It doesn't have to be a petabyte of data dumped into the cloud in one fell swoop; it can be moved steadily, one byte at a time. But the first bytes must be dictated by a clearly defined strategy that allows the business to eventually move everything else.
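As a rough sketch of what staged movement can look like in practice (assuming an S3-compatible target and Python's boto3 client; the bucket name and directory layout below are hypothetical), a small batch uploader might migrate one prioritized directory tree per run:

    import os
    import boto3

    # Hypothetical staged migration: move one directory tree per run, so data
    # reaches the cloud in deliberate increments rather than a single dump.
    s3 = boto3.client("s3")
    BUCKET = "example-hybrid-tier"  # placeholder bucket name

    def migrate_tree(local_root: str, prefix: str) -> None:
        """Upload every file under local_root, preserving relative paths."""
        for dirpath, _, filenames in os.walk(local_root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                key = prefix + "/" + os.path.relpath(path, local_root)
                s3.upload_file(path, BUCKET, key)
                print("migrated", path, "->", "s3://" + BUCKET + "/" + key)

    # Migrate the workflows you have prioritized first, for example:
    # migrate_tree("/data/projects/finance", "finance")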

Preserve your workflow

Don't change your security paradigm or how users access and authenticate to the data. This is absolutely critical. The data has to be locked down and encrypted, and the architecture has to factor in the risks.
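As one illustrative sketch (not any vendor's implementation; the bucket and key names are placeholders), encryption at rest can be preserved on the cloud side by requesting server-side encryption on every upload, again using boto3:

    import boto3

    s3 = boto3.client("s3")

    def secure_upload(path: str, bucket: str, key: str) -> None:
        """Upload with server-side encryption so the object is encrypted
        at rest; HTTPS covers it in transit."""
        s3.upload_file(
            path, bucket, key,
            ExtraArgs={"ServerSideEncryption": "aws:kms"},
        )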

Ensure data availability

Business users work with unstructured data, and the challenge is to ensure that they have access to the same data at home as they did in the office. This requires getting the data into the cloud and using technology that pushes it closer to the end user. That access point can be a nearby site or a cloud region from which the user's data is served. Then, you need to extend those desktops to a home or remote location in a performant way. The data has to sit behind a ubiquitous access layer that makes it accessible across multiple geographies and time zones. This means that whether the demand is a simple document retrieval or a massive design file pulled from a high-performance collaborative workstation, the performance is the same.

First, consider cloud VDI to handle the high-performance requirements, as it allows you to extend a powerful workstation to a tablet in a coffee shop. The technology is there. Then, to build that ubiquitous access layer, use a cloud file solution that makes the same data accessible in real time. It's a combination of taking advantage of technology and leveraging it to create the work sweet spot. You have to make access to data fast, or you'll only solve one problem while creating others, such as data collisions and difficulties with search.
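To make the "pushing data closer to the user" idea concrete, here is a toy read-through cache in Python (all paths and names are hypothetical): the first open of a file triggers one remote fetch, and every subsequent open is served at local speed.

    import os
    import boto3

    CACHE_DIR = "/var/cache/edge-filer"  # hypothetical local cache tier
    s3 = boto3.client("s3")

    def read(bucket: str, key: str) -> bytes:
        """Serve a file from the local cache, falling back to the cloud on a miss."""
        local = os.path.join(CACHE_DIR, bucket, key)
        if not os.path.exists(local):                    # cache miss
            os.makedirs(os.path.dirname(local), exist_ok=True)
            s3.download_file(bucket, key, local)         # the only remote read
        with open(local, "rb") as f:                     # later reads stay local
            return f.read()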

Put a filer into any compute cloud

This allows users and applications to have access to the same data in any compute cloud. When you move applications and workflows into the cloud, that cloud is no different from a hybrid on-premises site. When the enterprise reads data between clouds, it's the same as reading it over the internet, which can incur high storage and egress costs. A filer caches everything locally, which removes the need for remote cloud reads, immediately saving money on egress charges and reducing latency.
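A quick back-of-the-envelope shows the effect. The egress price, read frequency, and hit ratio below are assumptions for illustration, not quoted figures:

    # Assumed numbers: $0.09/GB egress, a 1 TB working set read 50 times a
    # month, and a filer cache that absorbs 95% of those reads.
    egress_price_per_gb = 0.09
    working_set_gb = 1_000
    reads_per_month = 50
    hit_ratio = 0.95

    uncached = working_set_gb * reads_per_month * egress_price_per_gb
    cached = uncached * (1 - hit_ratio)
    print("without filer cache: $%.0f/month" % uncached)  # $4500/month
    print("with filer cache:    $%.0f/month" % cached)    # $225/month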

Reconsider your reluctance when it comes to a cloud-first strategy

If you didn't have this strategy to begin with, if you said the sun always shines at your company, you may have to reconsider and start implementing post-haste.

WFH Status: It’s Complicated

Remote working isn't new. Traveling workers, full-time remote workers, part-time telecommuters — these roles have been steadily evolving and compounding year-on-year because organizations could see the advantages in terms of access to talent and employee productivity. However, until recently, most companies didn't have 100% of their workforce working from home, as they do today. Maybe 10-20% were granted that golden ticket. The infrastructure was in place for that minority, and few corporate business continuity plans asked: what will happen if we send everybody at every global location home at the same time?

Why would they? Disasters are typically localized. Today, that has fundamentally changed. The business has to look at its continuity plan and say, "I need a contingency for a global shutdown, because this can happen again."

However, building that contingency to support 100% of the workforce changes the investment parameters. The business has had to ensure that its entire workforce can work from home and has invested in resources that allow for it. Now, what happens when the pandemic subsides? If the business drops back to 20% remote working, then it's a sunk investment.

Companies that have proven the WFH model works are very likely to adopt progressive remote working plans that leverage this architecture and the benefits that working from home brings. It may seem a dire and costly outlook in light of the economy and lost income, but these investments can help organizations save money. If integration is accessible, replication is designed for redundancy, and data is consolidated intelligently, then your business has invested in resiliency and in technology that will pay for itself.

Rich Weber is President of Panzura