Malware Adds Sandbox Detection to Evade Analysis

Any.Run is a malware analysis sandbox service that lets researchers and users safely analyze malware without risk to their computers.

When an executable is submitted to Any.Run, the service creates a Windows virtual machine with an interactive remote desktop and executes the submitted file within it.

Researchers can use the interactive Windows desktop to observe the malware's behavior while Any.Run records its network activity, file activity, and registry changes.

In a newly discovered spam campaign, malicious PowerShell scripts download and install a password-stealing trojan onto the victim's computer.

Before installing the payload, the script checks whether it is running on Any.Run. If it detects the sandbox, it displays the message ‘Any.run Deteceted!’ [sic] and exits, so the malware never runs and the sandbox has nothing to analyze.
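Sandbox checks of this kind are usually simple. The Python sketch below is purely illustrative (the campaign's actual check is not reproduced here): it shows the general pattern of comparing environment artifacts such as machine and user names against values associated with an analysis sandbox and bailing out on a match. The marker values are made up for the example, not real Any.Run indicators.

```python
import os
import platform
import sys

# Hypothetical sandbox markers -- placeholder values for illustration,
# not indicators taken from the real malware or from Any.Run.
SANDBOX_MARKERS = {"SANDBOX-PC", "ANALYSIS", "MALTEST"}

def running_in_sandbox() -> bool:
    """Compare simple environment artifacts against suspected sandbox values."""
    hostname = platform.node().upper()
    username = os.environ.get("USERNAME", os.environ.get("USER", "")).upper()
    return hostname in SANDBOX_MARKERS or username in SANDBOX_MARKERS

if running_in_sandbox():
    print("Any.run Deteceted!")  # mimics the campaign's (misspelled) message
    sys.exit(0)                  # exit before any payload logic would run
```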

Using this method, threat actors make it more difficult for researchers to analyze their attacks using an automated system.

When executed on a normal virtual machine or a live system, the password-stealing trojan runs as intended and steals login credentials saved in browsers, FTP clients, and other software.

While this will not stop a researcher from analyzing a particular sample through other methods, it does force them to put more effort into the analysis.

With online malware analysis sandboxes becoming more widely used by security researchers, we can expect more malware to target them.

Cloud vs. On-Premises: The Debate in the New Normal

This debate has been running for years. When cloud first arrived as a newcomer to the market, the objections started immediately: prices will be high, data will leak, security and compliance will suffer, and so on. Now, after COVID, it is worth asking whether that mentality has changed or stayed the same.

Despite the overwhelming momentum of cloud, IT organisations are still paralyzed by the “cloud” vs. “on-premises” debate. Myths about cost, security, and data protection derail cloud initiatives, while other companies gain competitive advantage from cloud’s flexibility. By understanding the true costs and benefits of the cloud, businesses can make an informed decision and prepare for the future.

Cost Effectiveness

Discussions about cloud costs tend to be both extreme and vague. One side touts hyperscale users that save mind-boggling sums of money. The other side points to horrifying cloud bills bursting with “hidden fees.” The middle ground is littered with platitudes like “Cloud works well for ‘dynamic applications,’” which many take to mean that only applications using Kubernetes, serverless functions, and object storage could possibly be cost-effective.

Even without re-architecting applications, companies should be able to save money in the cloud through three initiatives. First, move to SaaS applications for core services. SaaS providers like Druva create highly optimized cloud applications, so they can pass the savings on to customers. Second, shut down smaller data centers and move their workloads to the cloud.

Even without tuning applications for the cloud, the operational and capital savings of eliminating data centers outstrip any inefficiencies. The third initiative is to migrate applications that are over-provisioned or running on the wrong type of data center infrastructure. Cloud offers a chance to optimize even static applications with a broader menu of compute and storage options than most companies can run in their own data centers.

The biggest secret to extracting value from the cloud is to create momentum. Don’t waste time debating the 10% of applications that will be difficult to move. Just as some companies never fully virtualized, some will never fully move to the cloud. Fortunately, as you migrate simple workloads to the cloud, you will save money, free up capacity for on-premises applications, and build a “cloud-first” organisation.

Security and Compliance

The question is not whether the cloud is more secure than on-premises, but how businesses can best use the cloud to improve their security. For over a decade, compliance agencies, cyber criminals, and customers have probed cloud providers for vulnerabilities. The relentless scrutiny has driven cloud and SaaS providers to invest in teams and technology to outpace potential threats.

In fact, Gartner wrote “the majority of cloud providers invest significantly in security, realizing that their business would be at risk without doing so.” As a result, cloud now has some of the highest levels of security available, backed by broad federal certifications. Conversely, most individual businesses lack the expertise, time, and staff to keep pace with sophisticated and continuously evolving attacks.

The cloud provider ensures the security of the environment, but the customer is still responsible for their data. Thus, customers should encrypt data in transit and at rest, verify that object stores are not exposed to the outside world, and manage their network policies closely. Cloud teams must also protect against internal threats, especially by monitoring the network for unusual user activity and access patterns. Finally, since organisations are creating more accounts within the cloud, security checks need to be automated so they can scale with the cloud environment.
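As one small example of automating such checks, the Python sketch below (assuming an AWS environment, the boto3 library, and configured credentials) flags S3 buckets whose public access block is missing or not fully enabled. It is a starting point for an automated audit, not a complete one.

```python
import boto3  # pip install boto3; assumes AWS credentials are configured
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

# Flag any bucket that is not fully shielded from public access.
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)[
            "PublicAccessBlockConfiguration"
        ]
        if not all(config.values()):
            print(f"WARNING: {name} does not fully block public access: {config}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"WARNING: {name} has no public access block configured")
        else:
            raise
```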

A secure cloud can then improve cyber security for the on-premises environment. When hackers breach a data center, they attack both production and backup environments, leaving the company little choice but to pay the ransom. Many IT teams are trying to retrofit air-gapped backups onto their existing solutions, which is an error-prone and expensive process. With a SaaS data protection provider, the data is automatically isolated and immutable, so customers can be confident their cloud backups will always be safe and rapidly recoverable.

Protecting Your Data

While customers worry too much about cost and security, they worry too little about protecting their data. While cloud providers protect the IT infrastructure, you are responsible for your data. Furthermore, data in the cloud is subject to the same litigation, compliance, and governance requirements as it was on-premises. It is also just as likely to be deleted or corrupted due to user error, application error, or malicious internal users. Many business teams do not consider the risks to their data when they use cloud, but data protection is more important than ever.

While the core protection requirements in the cloud and on-premises are the same, the users’ expectations are not. Self-service and agility are so important in the cloud that teams will not wait for “backup teams” to configure protection or run restores. Instead, protection must be built into the environment, so that new application data can be automatically backed up.
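What “built in” looks like depends on the platform. As a hedged illustration, assuming an AWS environment with an existing AWS Backup plan, the boto3 sketch below attaches a tag-based selection to that plan so any resource tagged backup=auto is protected automatically, with no ticket to a backup team. The plan ID, role ARN, and tag values are placeholders.

```python
import boto3  # pip install boto3; assumes AWS credentials are configured

backup = boto3.client("backup")

# Placeholder identifiers -- substitute a real plan ID and IAM role ARN.
PLAN_ID = "example-backup-plan-id"
ROLE_ARN = "arn:aws:iam::123456789012:role/ExampleBackupRole"

# Any resource tagged backup=auto is swept into the plan automatically,
# so new application data is protected without manual configuration.
backup.create_backup_selection(
    BackupPlanId=PLAN_ID,
    BackupSelection={
        "SelectionName": "auto-tagged-resources",
        "IamRoleArn": ROLE_ARN,
        "ListOfTags": [
            {
                "ConditionType": "STRINGEQUALS",
                "ConditionKey": "backup",
                "ConditionValue": "auto",
            }
        ],
    },
)
```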

Then, in the event of an issue, application owners need to be able to run self-service recoveries from their application’s interface. Meanwhile, in the background, the cloud team should centrally manage creating, securing, and retaining the backups. Successful organisations protect both data centers and cloud, but they evolve their legacy technology and processes to meet their cloud teams’ needs.

Business Flexibility

Once a customer has shifted to a “cloud-first” environment, they uncover the true competitive advantage – the sheer speed of scaling resources up and down. While it takes months to procure and install on-premises infrastructure, a team can provision cloud capacity in minutes. More importantly, unlike capital expenditures, cloud capacity can also be released in minutes.

Organisations worry about the risk of investing either too early or too late to take advantage of a resurgent economy. If they invest too late, their competitors bypass them. If they invest too early, they can be forced to make deep cuts. Therefore, most companies move cautiously. With the flexibility of the cloud, however, market leaders are preparing to move aggressively.

Once you get past the myths to the truth about the cost, security, and protection of the cloud, you can see the value of its flexibility. Cloud shifts infrastructure from a cost center to a strategic platform that helps companies embrace opportunities in an uncertain future. Those that thrive in a period of change will be agile, secure and able to scale at speeds that out-compete rivals. Done right, scaling with the cloud can help companies become the new market leaders.

Mozilla to Cut TLS Certificate Lifespans

Mozilla is planning to implement the change in the coming months, regardless of the outcome of a vote on the issue by a key industry group.

The CA/Browser Forum, which sets policies for certificate authorities and browser makers, has been considering the change for some time and the proposal has significant support among the browser vendors. An updated version of the proposal that would reduce the lifespan of TLS certificates to a maximum of 398 days is active now.

Currently, the policy allows a maximum lifespan of 825 days, or about 27 months. A lot can change in that amount of time, and that's one of the main reasons Mozilla and other companies are supporting the change. TLS certificates serve several purposes, including enabling encrypted sessions between clients and sites.

“TLS certificates provide authentication, meaning that you can be sure that you are sending information to the correct server and not to an imposter trying to steal your information. If the owner of the domain changes or the cloud service provider changes, the holder of the TLS certificate’s private key (e.g. the previous owner of the domain or the previous cloud service provider) can impersonate the website until that TLS certificate expires,” Ben Wilson, technical program manager at Mozilla, said in a post detailing the company’s position.

“Keys valid for longer than one year have greater exposure to compromise.”

Long lifespans for TLS certificates can be problematic in a number of ways aside from the potential for impersonation. In order to provide compatibility with various browsers and client systems, certificates support several ciphersuites for encryption and hash algorithms for signatures. That's all fine until there's a serious issue with one of the ciphersuites or hash algorithms that necessitates revoking and reissuing certificates. This is a relatively rare occurrence, but when it happens it's a major disruption for site owners, CAs, and individuals trying to make a secure connection to an affected site.

In recent years, collisions discovered in both the SHA-1 and MD5 hash algorithms put certificates signed with those algorithms at risk of forgery. The issues were public, but because of the long lifespans of TLS certificates at the time the collisions were disclosed, it took many years to phase out all of the affected certificates. Reducing the lifespan of certificates would mitigate this kind of problem while also limiting the amount of time a given keypair is valid.

Keys valid for longer than one year have greater exposure to compromise, and a compromised key could enable an attacker to intercept secure communications and/or impersonate a website until the TLS certificate expires. A good security practice is to change key pairs frequently, which should happen when you obtain a new certificate.
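To make the lifespan limit concrete, the Python sketch below (standard library only) fetches a site's TLS certificate and reports whether its validity period exceeds 398 days. The hostname is just an example.

```python
import socket
import ssl

MAX_LIFETIME_DAYS = 398  # the maximum in the CA/Browser Forum proposal

def certificate_lifetime_days(host: str, port: int = 443) -> float:
    """Fetch a host's TLS certificate and return its validity period in days."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    not_before = ssl.cert_time_to_seconds(cert["notBefore"])
    not_after = ssl.cert_time_to_seconds(cert["notAfter"])
    return (not_after - not_before) / 86400

lifetime = certificate_lifetime_days("mozilla.org")  # example hostname
print(f"Certificate valid for {lifetime:.0f} days")
if lifetime > MAX_LIFETIME_DAYS:
    print("Exceeds the proposed 398-day maximum")
```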

The current proposal would have the 398-day lifespan go into effect on Sept. 1 if it passes. But even if the proposal fails, Mozilla intends to change its policy to limit certificate lifespans to 398 days.

Apple recently announced that it will enforce the same 398-day limit beginning in September 2020.

USB: A Strategic Carrier of Risk

There’s no denying the convenience of USB media. From hard drives and flash drives to a wide range of other devices, they offer a fast, simple way to transport, share and store data. However, from a business security perspective, their highly accessible and portable nature makes them a complete nightmare, with data leakage, theft, and loss all common occurrences.

Widespread remote working appears to have compounded these issues. According to new research, there's been a 123% increase in the volume of data downloaded to USB media by employees since the onset of COVID-19, suggesting many have used such devices to take large volumes of data home with them. As a result, there are hundreds of terabytes of potentially sensitive, unencrypted corporate data floating around at any given time, greatly increasing the risk of serious data loss.

Fortunately, effective implementation of USB control and encryption can significantly minimize that risk.

What is USB control and encryption?

USB control and encryption refers to the set of techniques and practices used to control and secure device access to USB ports. These techniques and practices form a key part of endpoint security, helping protect both computer systems and sensitive data assets from loss, as well as from security threats (e.g., malware) that can be deployed via physically plugged-in USB devices.

There are numerous ways that USB control and encryption can be implemented. The most authoritarian approach is to block the use of USB devices altogether, either by physically covering endpoint USB ports or by disabling USB adapters through the operating system. While this is certainly effective, for the vast majority of businesses it simply isn't a workable approach given the huge number of peripheral devices that rely on USB ports to function, such as keyboards, chargers, and printers.

Instead, a more practical approach is to combine less draconian physical measures with the use of encryption that protects sensitive data itself, meaning even if a flash drive containing such data is lost or stolen, its contents remain safe. The easiest (and usually most expensive) way to do this is by purchasing devices that already have robust encryption algorithms built into them.

A cheaper alternative is to implement and enforce specific IT policies governing the use of USB devices. Such a policy could either permit employees to use only certain “authenticated” USB devices – whose file systems have been manually encrypted – or stipulate that individual files must be encrypted before they can be transferred to a USB storage device.
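As a minimal sketch of the “encrypt before transfer” policy, the Python below uses the third-party cryptography package's Fernet recipe (one reasonable choice, not the only one) to encrypt a file before it is copied to a removable drive. The file name, drive path, and ad hoc key handling are placeholders; a real deployment would pull keys from a managed secret store.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Placeholder key handling: a real policy would load the key from a
# managed secret store rather than generating it ad hoc.
key = Fernet.generate_key()
fernet = Fernet(key)

# Hypothetical file to be moved to a USB stick.
with open("quarterly_report.xlsx", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

# Only the encrypted copy is written to the removable drive
# (hypothetical mount point).
with open("E:/quarterly_report.xlsx.enc", "wb") as f:
    f.write(ciphertext)
```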

Greater control means better security

The default USB port controls offered as part of most operating systems tend to be quite limited in terms of functionality. Security teams can choose to leave them completely open, designate them as read-only, or fully disable them.

With the help of dedicated USB control applications, admins can limit or block specific types of USB devices on specific endpoint ports. A good example would be permitting the use of USB-connected mice via the port while banning storage devices, such as USB sticks, that pose a much greater threat to security.
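On Windows, one long-standing way to achieve exactly this split is to disable the USB mass-storage driver while leaving other USB device classes alone. The Python sketch below uses the standard winreg module to set the USBSTOR service's Start value to 4 (disabled); it must run with administrator rights, and it illustrates the technique rather than substituting for a managed control product.

```python
import winreg  # Windows standard library; requires administrator rights

# Disabling the USB mass-storage driver (USBSTOR) blocks flash drives and
# external disks while keyboards, mice, and other USB classes keep working.
# Start = 3 means the driver loads on demand; Start = 4 means disabled.
KEY_PATH = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

with winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE
) as key:
    winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, 4)

print("USB mass storage disabled for newly attached devices")
```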

Some control applications go further still, allowing security teams to put rules in place that govern USB ports down to an individual level. This includes specifying exactly what kinds of files can be copied or transferred via a particular USB port or stipulating that a particular port can only be used by devices from a pre-approved whitelist (based on their serial number). Such controls can be extremely effective at preventing unauthorized data egress, as well as malicious actions like trying to upload malware via an unauthorized USB stick.
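On Linux, a serial-number whitelist like this can be prototyped with the third-party pyudev library, which exposes udev device attributes. The sketch below (an illustration, not any particular vendor's product) lists connected USB devices and flags any whose serial number is not pre-approved; a real control product would enforce the rule at the driver level rather than merely report.

```python
import pyudev  # pip install pyudev; Linux-only (relies on udev)

# Hypothetical pre-approved device serial numbers.
APPROVED_SERIALS = {"4C530001234567891234", "AA04012700013274"}

context = pyudev.Context()
for device in context.list_devices(subsystem="usb", DEVTYPE="usb_device"):
    serial = device.properties.get("ID_SERIAL_SHORT")
    if serial is None:
        continue  # hubs and some devices expose no serial number
    status = "approved" if serial in APPROVED_SERIALS else "NOT APPROVED"
    print(f"{device.sys_name}: serial={serial} -> {status}")
```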

A centrally controlled solution saves significant logistical headaches

It's worth noting that a normal business network can contain hundreds, or even thousands, of endpoints, each with one or more USB ports. As such, control and encryption solutions that can be managed centrally, rather than on an individual basis, are significantly easier to implement and manage. This is particularly true at the current point in time, when remote working protocols make it almost impossible to effectively manage devices any other way.

While employees see portable USB drives and devices as a quick, convenient way to transport or store data, they often present a major headache for security professionals.

Fortunately, implementing USB control and encryption solutions can greatly improve the tools at a security team's disposal to deal with such challenges and ensure that both the network and sensitive company data remain protected at all times.