Microsoft 365 was down Monday evening, blocking users’ new access requests to multiple services including Outlook, Word, Excel and Microsoft Teams.
“We’re investigating an issue affecting access to multiple Microsoft 365 services,” the Microsoft 365 Status account tweeted Monday at 5:44 p.m. ET. “We’re working to identify the full impact and will provide more information shortly.”
“Users may be unable to access multiple Microsoft 365 services,” the software giant posted on its Office status website.
The company determined that a specific portion of its infrastructure was not processing authentication requests in a timely manner. “We’re pursuing mitigation steps for this issue,” the status update said.
Microsoft Office program users who were already logged in would be able to continue their sessions, the company confirmed.
Microsoft Office outage reports began coming in at 5 p.m. ET Monday at online traffic site DownDetector. Some users began reporting a return of service about 8:30 p.m. ET on the site.
The outage stopped work for some, but created more work for others: IT specialists. “The #Office365 outage is generating tickets like crazy,” tweeted one. “I have just told 5 people in a row: ‘No, I cannot fix it. Microsoft is working on it.’”
But others on Twitter had fun at Microsoft’s expense. “There’s a global 365 outage affecting microsoft outlook, i guess we won Monday after all.”
Another Twitter user posted a global outage map, noting, “The Microsoft 365 Azure Outage isn’t that bad, it’s only down in places with people that are awake.”
This topic has been debated for years. When cloud computing first arrived as a newcomer to the market, the objections began: prices would be high, data would leak, security and compliance would suffer. Now, after Covid, let’s see whether that mentality has changed or remains the same.
Despite the overwhelming momentum of cloud, IT organisations are still paralyzed by the “cloud” vs. “on-premises” debate. Myths about cost, security, and data protection derail cloud initiatives, while other companies gain competitive advantage from cloud’s flexibility. By understanding the true costs and benefits of the cloud, businesses can make an informed decision and prepare for the future.
Discussions about cloud costs tend to be both extreme and vague. One side touts hyperscale adopters that save mind-boggling sums of money. The other side points to horrifying cloud bills bursting with “hidden fees.” The middle ground is littered with platitudes like “Cloud works well for ‘dynamic applications,’” which many take to mean that only applications using Kubernetes, serverless functions, and object storage could possibly be cost-effective.
Even without re-architecting applications, companies should be able to save money in the cloud through three different initiatives. First, move to SaaS applications for core services. SaaS providers like Druva create highly optimized cloud applications, so they can pass the savings on to their customers. Second, shut down smaller data centers and move their workloads to the cloud.
Even without tuning applications for the cloud, the operational and capital savings of eliminating data centers outstrip any inefficiencies. The third initiative is to migrate applications that are over-provisioned or running on the wrong type of infrastructure. The cloud offers a chance to optimize even static applications, with a broader menu of compute and storage options than most companies can run in their own data centers.
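The right-sizing argument above can be made concrete with a little arithmetic. This sketch compares an over-provisioned server against an instance sized to the workload's actual peak; all figures are hypothetical and chosen purely for illustration.

```python
# Illustrative right-sizing arithmetic (all numbers are hypothetical).
# An over-provisioned server runs at low utilization; in the cloud the
# same workload can be placed on an instance sized to its actual peak.

provisioned_vcpus = 16      # what the on-premises server was bought with
peak_used_vcpus = 4         # what the workload actually needs at peak
cost_per_vcpu_month = 20.0  # assumed flat unit cost, for comparison only

on_prem_cost = provisioned_vcpus * cost_per_vcpu_month
right_sized_cost = peak_used_vcpus * cost_per_vcpu_month
savings_pct = 100 * (on_prem_cost - right_sized_cost) / on_prem_cost
print(f"{savings_pct:.0f}% saved by right-sizing")  # 75% saved
```

The point is not the specific percentage but that savings scale directly with the gap between provisioned and used capacity, which is exactly what a static, over-provisioned application carries.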
The biggest secret to extracting value from the cloud is to create momentum. Don’t waste time debating the 10% of applications that will be difficult to move. Just as some companies never fully virtualized, some will never fully move to the cloud. Fortunately, as you migrate simple workloads to the cloud, you will save money, free up capacity for on-premises applications, and build a “cloud-first” organisation.
Security and Compliance
The question is not whether the cloud is more secure than on-premises, but how businesses can best use the cloud to improve their security. For over a decade, compliance agencies, cyber criminals, and customers have probed cloud providers for vulnerabilities. The relentless scrutiny has driven cloud and SaaS providers to invest in teams and technology to outpace potential threats.
In fact, Gartner wrote “the majority of cloud providers invest significantly in security, realizing that their business would be at risk without doing so.” As a result, cloud now has some of the highest levels of security available, backed by broad federal certifications. Conversely, most individual businesses lack the expertise, time, and staff to keep pace with sophisticated and continuously evolving attacks.
The cloud provider ensures the security of the environment, but the customer is still responsible for their data. Thus, customers should encrypt data in movement and at rest, verify that object stores are not exposed to the outside world, and manage their network policies closely. Cloud teams must also protect from internal threats, especially through network monitoring for unusual user activity and access patterns. Finally, since organisations are creating more accounts within the cloud, security checks need to be automated, so they can scale with the cloud environment.
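One of the checks above, verifying that object stores are not exposed to the outside world, lends itself well to automation. The following is a minimal sketch assuming bucket configuration has already been fetched into plain dictionaries (a hypothetical shape; a real scanner would pull ACLs and policies from the cloud provider's API).

```python
# Minimal sketch of an automated exposure check. The dictionary shape is
# hypothetical; a real scanner would query the cloud provider's API.

def is_publicly_exposed(bucket: dict) -> bool:
    """Flag a bucket whose ACL or policy grants access to everyone."""
    # ACL grants to "AllUsers" (or any authenticated user) expose the bucket.
    if any(grantee in ("AllUsers", "AuthenticatedUsers")
           for grantee in bucket.get("acl_grants", [])):
        return True
    # A policy statement allowing Principal "*" also exposes it.
    for stmt in bucket.get("policy_statements", []):
        if stmt.get("Effect") == "Allow" and stmt.get("Principal") == "*":
            return True
    return False

buckets = [
    {"name": "backups", "acl_grants": ["OwnerOnly"], "policy_statements": []},
    {"name": "website", "acl_grants": ["AllUsers"],  "policy_statements": []},
]
exposed = [b["name"] for b in buckets if is_publicly_exposed(b)]
print(exposed)  # only "website" is flagged
```

Run on a schedule against every account, a check like this scales with the cloud environment instead of relying on manual review.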
A secure cloud can then improve cyber security for the on-premises environment. When hackers breach a data center, they attack both production and backup environments, so the company has no choice but to pay the ransom. Many IT teams are trying to retrofit air gapped backups onto their existing solution, which is an error-prone and expensive process. With a SaaS data protection provider, the data is automatically isolated and immutable, so customers can be confident their cloud backups will always be safe and rapidly recoverable.
Protecting Your Data
While customers worry too much about cost and security, they worry too little about protecting their data. While cloud providers protect the IT infrastructure, you are responsible for your data. Furthermore, data in the cloud is subject to the same litigation, compliance, and governance requirements as it was on-premises. It is also just as likely to be deleted or corrupted due to user error, application error, or malicious internal users. Many business teams do not consider the risks to their data when they use cloud, but data protection is more important than ever.
While the core protection requirements in the cloud and on-premises are the same, the users’ expectations are not. Self-service and agility are so important in the cloud that teams will not wait for “backup teams” to configure protection or run restores. Instead, protection must be built into the environment, so that new application data can be automatically backed up.
Then, in the event of an issue, application owners need to be able to run self-service recoveries from their application’s interface. Meanwhile, in the background, the cloud team should centrally manage creating, securing, and retaining the backups. Successful organisations protect both data centers and cloud, but they evolve their legacy technology and processes to meet their cloud teams’ needs.
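Building protection into the environment, as described above, is often done by deriving backup policy from resource tags so that new application data is covered automatically. This sketch assumes simple key/value tags and invented policy names; real cloud tagging APIs and backup services differ.

```python
# Sketch of tag-driven, built-in protection. Tag and policy names are
# hypothetical; real cloud tagging and backup services differ.

BACKUP_POLICIES = {
    "production": {"frequency_hours": 4,  "retain_days": 35},
    "default":    {"frequency_hours": 24, "retain_days": 7},
}

def policy_for(resource: dict) -> dict:
    """New resources get a backup policy automatically from their tags."""
    tier = resource.get("tags", {}).get("tier", "default")
    return BACKUP_POLICIES.get(tier, BACKUP_POLICIES["default"])

db = {"name": "orders-db", "tags": {"tier": "production"}}
print(policy_for(db))  # production resources back up every 4 hours
```

Because the policy is resolved from tags at creation time, application teams never wait on a backup team, while the cloud team still controls retention and security centrally.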
Once a customer has shifted to a “cloud-first” environment, they uncover the true competitive advantage – the sheer speed of scaling resources up and down. While it takes months to procure and install on-premises infrastructure, a team can provision cloud capacity in minutes. More importantly, unlike capital expenditures, cloud capacity can also be released in minutes.
Organisations worry about the risk of investing either too early or too late to take advantage of a resurgent economy. If they invest too late, their competitors bypass them. If they invest too early, they can be forced to make deep cuts. Therefore, most companies move cautiously. With the flexibility of the cloud, however, market leaders are preparing to move aggressively.
Once you get past the myths to the truth about the cost, security, and protection of the cloud, you can see the value of its flexibility. Cloud shifts infrastructure from a cost center to a strategic platform that helps companies embrace opportunities in an uncertain future. Those that thrive in a period of change will be agile, secure and able to scale at speeds that out-compete rivals. Done right, scaling with the cloud can help companies become the new market leaders.
Software Defined Perimeter (SDP) is the most effective architecture for adopting a zero trust strategy, an approach that is being heralded as the breakthrough technology for preventing large-scale breaches.
SDP zero trust
“Most of the existing zero trust security measures are applied as authentication and sometimes authorization, based on policy after the termination of Transport Layer Security (TLS) certificates.”
“Network segmentation and the establishment of micro networks, which are so important for multi-cloud deployments, also benefit from adopting a software-defined perimeter zero trust architecture.”
SDP improves security posture
A zero trust implementation using SDP enables organizations to defend against new variations of old attack methods that constantly surface in existing perimeter-centric network and infrastructure models.
Implementing SDP improves the security posture of businesses facing the challenge of continuously adapting to attack surfaces that are both expanding and increasingly complex.
Network security implementation issues
The report notes particular issues that have arisen that require a rapid change in the way network security is implemented, including:
1. The changing perimeter
2. The IP address challenge
3. The challenge of implementing integrated controls
Zscaler Inc. is doubling down in its drive to dominate the market for “zero trust” security frameworks with its second acquisition in about six weeks.
The cloud security specialist is acquiring Edgewise Networks, a four-year-old Boston area startup focused on securing communications among applications running in cloud and datacenter networks.
The acquisition of Edgewise Networks addresses growing enterprise requirements to detect security threats that can spread rapidly across a network from a single compromised server. The startup’s tools focus on securing so-called “east-west,” or lateral, network traffic by verifying application software and other services.
The result, Zscaler said, is a zero-trust environment in which no one inside or outside a network is trusted by default. The security approach is said to reduce cloud and datacenter attack surfaces, thereby reducing data breaches and application hacks.
The startup’s zero-trust approach discovers individual applications and their legitimate communication patterns. AI and machine learning algorithms are then used to automatically enforce authorized communication to provide a security layer called application segmentation. That approach isolates distinct service tiers from one another within an application to create security boundaries that reduce exposure to attacks originating from other applications.
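The enforcement model described above can be reduced to a deny-by-default allowlist of learned communication patterns. This toy sketch uses invented service names and omits the software-identity verification Edgewise layers on top; it only illustrates the segmentation logic.

```python
# Toy illustration of application segmentation: only learned, legitimate
# (source, destination) flows are permitted; everything else is denied
# by default. Service names are hypothetical.

ALLOWED_FLOWS = {
    ("web-frontend", "api-gateway"),
    ("api-gateway", "orders-service"),
    ("orders-service", "orders-db"),
}

def allow(source: str, destination: str) -> bool:
    """Zero trust default: deny unless the flow was explicitly learned."""
    return (source, destination) in ALLOWED_FLOWS

print(allow("api-gateway", "orders-service"))  # True: legitimate tier-to-tier call
print(allow("web-frontend", "orders-db"))      # False: lateral movement blocked
```

Note how the frontend can reach the gateway but never the database directly: each service tier is isolated from the others, which is the security boundary application segmentation creates.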
“Edgewise is highly innovative technology that enables application segmentation without having to do traditional network segmentation, which is often done with virtual firewalls.”
The zero-trust security framework is geared to the growing number of enterprise multi-cloud deployments that increasingly use microservices to deliver distributed applications. The many moving parts create more opportunities for security breaches via compromised servers and applications.
The Edgewise framework uses a technique called software identity verification to secure network traffic carried across public and hybrid clouds, datacenters, and application containers.