It can be tempting to think that your company is safe because you've invested in a tech platform that presents a smaller attack surface to the network. For instance, Macintosh users have claimed for years that they're in a safer position than those who use most other operating systems. Apple's recent switch to M1 processors was heralded as a further step forward that would make the platform even more secure, since these chips are based on the ARM architecture and therefore supposedly immune to the various security bugs that have plagued Intel's chips in recent years.
Security researchers have now demonstrated that it's possible to spread malware via the Xcode environment to computers running macOS 11 on M1 processors. This has thrown a major curve ball to those who thought their systems were safe simply because they weren't Intel-based. No matter what kind of technology you're using, there's going to be some risk inherent in connecting it to a network. You'd only be truly safe if you never moved information into or out of a machine.
Since few people would ever want to run a business like that, you’ll more than likely want to put at least some kind of mitigation strategy in place.
The above is an example of security by obscurity, and it's a valid strategy for an IT department to try, but even a company that's pursuing it should still consider itself at risk. A firm engaged in edge computing that uses all custom cloud apps could still harbor some kind of zero-day vulnerability that remains undiscovered until, suddenly, a bad actor stumbles upon it while trying to gain access to the firm's storage services. When that happens, there's a good chance the attacker could execute arbitrary code.
Implementing foundational and organizational cyber security controls is vital when it comes to reducing your firm's risk of falling prey to bad actors. According to the list of the top 20 CIS critical security controls, creating an active inventory of all of the physical hardware devices connected to a network is the most basic thing an IT department should do to mitigate the risk of a cyber attack. This inventory needs to be regularly updated. If something seems amiss, there's a good chance that someone has unauthorized access to the network.
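To make that control concrete, here's a minimal sketch of what a first-pass inventory sweep might look like. Everything in it is an assumption for illustration: the subnet, the inventory.json output path, and a Linux host where `ping -c 1 -W 1` behaves as expected. A production inventory would also record MAC addresses, owners, and asset tags, and wouldn't rely on ICMP alone.

```python
#!/usr/bin/env python3
"""First-pass hardware inventory: ping-sweep a subnet and record live hosts.

Sketch only. Assumes a Linux host and a small /24 network; the serial sweep
is slow but keeps the example simple.
"""
import ipaddress
import json
import subprocess
from datetime import datetime, timezone

SUBNET = "192.168.1.0/24"          # assumption: replace with your own network
INVENTORY_FILE = "inventory.json"  # assumption: illustrative output path


def host_is_up(ip: str) -> bool:
    """Send one ICMP echo with a 1-second timeout; True if the host answered."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0


def sweep(subnet: str) -> dict:
    """Return {ip: last_seen_utc} for every host that responded."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        str(ip): now
        for ip in ipaddress.ip_network(subnet).hosts()
        if host_is_up(str(ip))
    }


if __name__ == "__main__":
    inventory = sweep(SUBNET)
    with open(INVENTORY_FILE, "w") as fh:
        json.dump(inventory, fh, indent=2)
    print(f"Recorded {len(inventory)} live hosts to {INVENTORY_FILE}")
```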
Only when this is complete should IT department staffers start to track software considerations. Virtualization has become a hot topic in the last few years, and the massive growth of virtual private servers has started to diminish the importance of physical hardware. That being said, even the most sophisticated VPS has to run on something, so it's important that IT staffers take note of everything that's connected to their organization's network. Pay close attention to everyone who has physical access to your facilities, as well. Before you say that physical attacks are a thing of the past, consider the fact that at least one bad actor has used a drone to gain access to networked printers.
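Once a baseline exists, spotting "something amiss" can be as simple as diffing what the network actually sees against what the IT team has approved. The sketch below assumes a Linux host (for the `ip neigh` command) and a hypothetical known_devices.json file mapping approved MAC addresses to descriptions:

```python
#!/usr/bin/env python3
"""Flag network devices missing from the approved baseline.

Sketch only. Assumes a Linux host (`ip neigh` available) and a hypothetical
known_devices.json mapping approved MAC addresses to descriptions, e.g.
{"aa:bb:cc:dd:ee:ff": "Front-desk printer"}.
"""
import json
import subprocess

BASELINE_FILE = "known_devices.json"  # assumption: maintained by the IT team


def current_neighbors() -> dict:
    """Return {mac: ip} for every entry in the kernel's neighbor table."""
    output = subprocess.run(
        ["ip", "neigh", "show"], capture_output=True, text=True, check=True
    ).stdout
    neighbors = {}
    for line in output.splitlines():
        fields = line.split()
        if "lladdr" in fields:  # entries without a MAC (e.g. FAILED) are skipped
            mac = fields[fields.index("lladdr") + 1].lower()
            neighbors[mac] = fields[0]  # first field is the IP address
    return neighbors


if __name__ == "__main__":
    with open(BASELINE_FILE) as fh:
        known = {mac.lower() for mac in json.load(fh)}
    for mac, ip in current_neighbors().items():
        if mac not in known:
            print(f"ALERT: unknown device {mac} at {ip}")
```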
Most of the other controls an organization should put in place are much less onerous than this, however, so you might not run into as much difficulty as you’d otherwise think.
So-called zero-day exploits are among the most difficult for IT departments to contend with, because there's always a strong possibility that software a company is running has been compromised without anyone realizing it. The good news is that enforcing a policy of prompt, regular updates is usually enough to deal with even complex problems once a patch ships, like those related to the recent Desktop Window Manager bug in Windows. A much bigger risk comes from individual users relying on their own tech at work.
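Enforcing that update policy starts with knowing which machines are behind. As a rough illustration, assuming a Debian or Ubuntu box where `apt list --upgradable` is available (Windows and macOS fleets would query their own update services, typically via an MDM), a check might look like this:

```python
#!/usr/bin/env python3
"""Report packages with pending updates on a Debian/Ubuntu host.

Sketch only. `apt` warns that its CLI output isn't a stable interface, so a
production version would use the python-apt bindings instead of parsing text.
"""
import subprocess


def pending_updates() -> list:
    """Return the names of packages that have an upgrade waiting."""
    output = subprocess.run(
        ["apt", "list", "--upgradable"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Lines look like: "openssl/jammy-updates 3.0.2-0ubuntu1.12 amd64 [upgradable from: ...]"
    return [
        line.split("/")[0]
        for line in output.splitlines()
        if "[upgradable" in line
    ]


if __name__ == "__main__":
    stale = pending_updates()
    if stale:
        print(f"{len(stale)} packages need patching: {', '.join(stale)}")
    else:
        print("No pending updates")
```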
A bring-your-own-device policy can be really helpful, but you never know quite what your staffers might be doing with their machines outside of work. Few companies want to issue corporate devices to every single individual when employees already have phones and laptops they could be using at work, but you'll want to put at least some sort of mitigation in place to deal with the added risk that comes with connecting a whole bunch of mobile devices to a single private network.
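One common mitigation is segmentation: company-managed hardware goes on the trusted network, while everything personal lands on an internet-only guest segment. The toy sketch below shows the decision logic only; the VLAN numbers and the managed-device list are made up, and in practice the enforcement would live in your switches or RADIUS server, not a script:

```python
#!/usr/bin/env python3
"""Sort devices joining the network into corporate and guest segments.

Toy sketch of BYOD admission logic, not a real NAC product. The VLAN IDs and
the managed-device list are assumptions; real enforcement would happen at the
switch/RADIUS layer.
"""
CORPORATE_VLAN = 10  # assumption: trusted, company-managed devices
GUEST_VLAN = 50      # assumption: personal devices, internet access only

# Assumption: MAC addresses of company-issued, MDM-enrolled hardware.
MANAGED_DEVICES = {
    "aa:bb:cc:dd:ee:01",
    "aa:bb:cc:dd:ee:02",
}


def assign_vlan(mac: str) -> int:
    """Managed hardware gets the corporate segment; everything else is a guest."""
    return CORPORATE_VLAN if mac.lower() in MANAGED_DEVICES else GUEST_VLAN


if __name__ == "__main__":
    for mac in ("AA:BB:CC:DD:EE:01", "de:ad:be:ef:00:42"):
        print(f"{mac} -> VLAN {assign_vlan(mac)}")
```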
The most recent numbers anyone seems to have suggest that 65 percent of IT departments still haven't automated their firewalls and another 38 percent continue to use ad hoc methods to report potential security issues. While you don't have to incorporate the most faddish strategies around, you will want to keep abreast of any changes in the industry.
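Automating a firewall doesn't have to mean buying a whole new platform; even scripting your existing ruleset is a step up from hand-editing. Here's a minimal sketch that applies a declared iptables ruleset idempotently, assuming a Linux gateway, root privileges, and rules you've already reviewed (the ones shown are placeholders):

```python
#!/usr/bin/env python3
"""Apply a declared iptables ruleset idempotently.

Sketch only. Assumes a Linux gateway, root privileges, and a ruleset you've
already reviewed; `iptables -C` exits non-zero when a rule is absent, which
lets the script add each rule exactly once.
"""
import subprocess

# Assumption: placeholder policy; substitute your own rules.
DESIRED_RULES = [
    ["INPUT", "-p", "tcp", "--dport", "22", "-s", "10.0.10.0/24", "-j", "ACCEPT"],
    ["INPUT", "-p", "tcp", "--dport", "22", "-j", "DROP"],
]


def rule_present(rule: list) -> bool:
    """`iptables -C` exits 0 only if an identical rule already exists."""
    return subprocess.run(
        ["iptables", "-C", *rule],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    ).returncode == 0


if __name__ == "__main__":
    for rule in DESIRED_RULES:
        if rule_present(rule):
            print("already present:", " ".join(rule))
        else:
            subprocess.run(["iptables", "-A", *rule], check=True)
            print("added:", " ".join(rule))
```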
Most importantly, you’ll want to make sure that everyone else on your team gets a chance to communicate their issues. Including all of your business’ departments will help to keep everybody on the same page at all times.