Self-Protection in the Internet World: Security Protection Has No Boundaries, and Only Zero Trust Can Be Truly Safe

Faced with a wide range of security threats, the internal network environment and the personal computers a company manages can be attacked and paralyzed, hijacked, or even turned into attackers themselves. There is no longer any meaningful distinction between inside and outside, so a zero-trust management model is imperative.
Published: Jun 12, 2020

Pay more attention to network security in the Internet age

In the era of digitalization and networking, trust is one of the foundations on which systems operate, information interoperates, and transactions are conducted. As technologies evolve, the availability and usability of services keep improving, but ensuring their security has remained an open problem, one that grows harder to deal with and that now seriously threatens the normal operation of the real world.

According to the Global Risks Report 2018 released by the World Economic Forum, cyber attacks ranked sixth among the top ten risks by severity of impact; among the top ten risks by likelihood, cyber attacks entered the top three for the first time, behind only extreme weather events and natural disasters, with data fraud or theft, which is also closely related to security, in fourth place.

There are too many security threats to enumerate. Even after a threat peaks and declines, it often resurfaces a while later, evolves, or blends with other threats in unpredictable ways. Overall, these threats have been able to run rampant because of the rapid rise and popularization of emerging technologies such as cloud services, mobile applications, and the Internet of Things. Their impact is overwhelming: neither individuals nor enterprises can resist them or stay out of their way. At the same time, because the door has been thrown wide open, threats that cross the inside-outside boundary, such as data leakage, DDoS, APT, and privileged account abuse, keep emerging and are difficult to cut off completely. Not knowing what to trust in the face of this situation is an accurate portrait of the present.

An intranet environment where boundaries no longer exist

In traditional network security architecture, users and application systems located inside the perimeter are generally considered trusted, so fewer verification mechanisms are applied to them. Nowadays, however, users and application systems are deployed everywhere, and there is no longer a meaningful distinction between inside and outside. Under these conditions, security protection should adopt an attitude of continuous verification and reluctance to trust.

Change in security thinking: from trust to zero trust

In the past, the IT industry invested heavily in strengthening users' trust in information systems. To trace these efforts to their source, we must start with Microsoft, which began promoting Trustworthy Computing in 2002.

At that time, vulnerabilities in the company's operating systems Windows NT 4.0 and Windows 2000 were being widely abused, and the systems suffered network worm attacks such as Code Red, Nimda, and Blaster. After painful reflection, Microsoft set out to harden the security of its operating systems and other software: starting with Windows XP SP2, it built in Windows Firewall and automatic Windows Update, and promoted its Security Development Lifecycle (SDL) program.

Microsoft's follow-ups in Windows Vista and Windows Server 2008, the built-in User Account Control (UAC) and Network Access Protection (NAP), are security features developed from the same concept. Beyond Microsoft's own efforts, AMD, HP, IBM, Intel, and Microsoft jointly established the Trusted Computing Group (TCG) in 2003. The TCG has driven the development of several well-known security technologies, such as the Trusted Platform Module (TPM) now built into more and more computers and servers, Trusted Network Connect (TNC) for network access control, and the self-encrypting drive (SED) specification for encryption performed by the hard disk itself.

During those years, the concepts of Layered Defense and Defense in Depth were quite popular, drawing attention to network access control (NAC), and many collaborative-defense products appeared on the market.

Later, with the rise of server virtualization platforms and cloud services, software-defined protection and cloud security products also began to take the stage, supporting the implementation of security defense strategies in these atypical network environments.

For example, Check Point, a veteran network firewall vendor, launched Software-Defined Protection in 2014. Server virtualization vendor VMware released vShield in 2011 and built NSX on its 2012 acquisition of network virtualization vendor Nicira; these products provide network segmentation to reduce the possibility of lateral attacks inside the network. Large public cloud services also offer multiple network partitioning mechanisms: on AWS, Amazon VPC provides subnets, security groups, and network access control lists (NACLs).
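Segmentation mechanisms such as NACLs follow a simple evaluation model: rules are checked in order, the first match wins, and anything unmatched is implicitly denied. The sketch below is a minimal, hypothetical illustration of that logic, not the actual AWS data model; the rule fields and the single-port simplification are assumptions for readability.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class AclRule:
    number: int   # rules are evaluated in ascending rule-number order
    cidr: str     # source address range the rule applies to
    port: int     # destination port (simplified to a single port here)
    action: str   # "allow" or "deny"

def evaluate_nacl(rules, src_ip, dst_port):
    """First matching rule wins; an implicit deny applies if nothing matches."""
    for rule in sorted(rules, key=lambda r: r.number):
        if ip_address(src_ip) in ip_network(rule.cidr) and rule.port == dst_port:
            return rule.action
    return "deny"  # implicit deny-all when no rule matched

# Hypothetical two-tier example: only the web tier may reach port 443.
rules = [
    AclRule(100, "10.0.1.0/24", 443, "allow"),
    AclRule(200, "0.0.0.0/0", 443, "deny"),
]
```

Because evaluation is ordered, placing a narrow allow rule before a broad deny rule is what carves out the permitted partition-to-partition path.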

In 2009, Forrester principal analyst John Kindervag proposed a new security model called the Zero Trust Model, which received great attention at the time. However, few security vendors responded actively in the early stage, the best known being next-generation firewall vendor Palo Alto Networks. By 2015, Forrester found that the related application fields were gradually expanding to cover virtualized network infrastructure (VNI) and network orchestration solutions. In the fourth quarter of 2016, the firm significantly expanded its list of zero-trust-based threat mitigation technologies to 20 types. In a report released in January this year, Forrester reorganized the zero-trust product types and corresponding vendors into an extended ecosystem, distinguishing a zero-trust platform; security automation and orchestration; security visibility and analytics; and the pillars of people, workloads, data, networks, and devices. "People" is further subdivided into user processes and identity management, while "networks" refers specifically to network segmentation and isolation.

Network isolation is one of the more common embodiments of the zero-trust model. When Forrester began advocating the model in 2010, it also designed a zero-trust network architecture around the concept, which included: a network Segmentation Gateway (SG) integrating multiple control and protection functions; Microcore and Perimeter (MCAP) zones that divide the network into multiple isolation areas according to their operational attributes; a server responsible for centralized management; and a Data Acquisition Network (DAN) responsible for collecting and analyzing network traffic.

The basic spirit of zero trust: trusted or not, nothing counts until it passes verification

What exactly is zero trust, and why are more and more security vendors embracing the concept? In short, in a zero-trust network environment, all network traffic is untrusted. Security professionals must therefore verify and protect all resources, restrict and strictly enforce access control, and detect and record all network traffic.

In John Kindervag's 2010 interpretation of the zero-trust network architecture, he put forward three core concepts: (1) on security devices, there are no longer trusted and untrusted interfaces; (2) there are no longer trusted and untrusted networks; (3) there are no longer trusted and untrusted users.

Extending the zero-trust model to the level of information security yields three basic concepts: (1) ensure that all resources are accessed securely, regardless of location; (2) adopt a least-privilege strategy and strictly enforce access control; (3) detect and record all traffic.
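The three concepts above can be condensed into a small sketch of an access gateway: every request is logged, unverified callers are refused regardless of where they connect from, and only explicitly assigned scopes are granted. The identities and permission scopes below are purely hypothetical examples, not part of any real system.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("zt-gateway")

# Hypothetical permission table: each identity holds only the scopes it needs.
PERMISSIONS = {
    "alice": {"read:reports"},
    "batch-job": {"write:metrics"},
}

def authorize(identity, verified, scope):
    """Log every request and deny by default; being 'internal' grants nothing."""
    log.info("request identity=%s scope=%s verified=%s", identity, scope, verified)
    if not verified:  # no implicit trust, even for callers on the intranet
        return False
    return scope in PERMISSIONS.get(identity, set())
```

Note that the decision never consults the caller's network location: failing verification or asking for an unassigned scope is enough to be refused.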

Under these concepts, we must assume that all network traffic is threatening. Unless the security team has verified that traffic is legitimately authorized, has passed inspection, and is protected, it is not trusted. For example, data access between internal and external networks is usually protected by encrypted channels, since cyber criminals can easily sniff unencrypted data; security teams must therefore apply the same protective attitude to internal data access as they do to data access across the external Internet.

In managing access behavior, the zero-trust model also employs more methods to verify identity and mitigate harm.

What we relied on in the past was mainly protection based on role-based access control (RBAC), commonly used in network access control, infrastructure software, and identity and access management systems. Under the zero-trust model, RBAC is no longer the only method. The focus shifts to configuring least-privilege access and enforcing precise access control to reduce the impact of attacks and abuse. For example, based on the identity a user presents, appropriate access to resources is allocated and limited to the range needed to perform the user's tasks, so that users can only do the right thing, instead of being blindly trusted and left with the possibility, through malice or negligence, of doing wrong.
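A least-privilege RBAC table can be sketched as a mapping from each role to the minimal set of actions its job requires, with everything else denied by default. The roles and action names below are hypothetical, chosen only to illustrate the scoping idea.

```python
# Hypothetical role table: each role receives only the actions its job requires.
ROLE_PERMISSIONS = {
    "auditor": {"invoice:read"},
    "clerk":   {"invoice:read", "invoice:create"},
    "manager": {"invoice:read", "invoice:create", "invoice:approve"},
}

def can_perform(role: str, action: str) -> bool:
    """Default deny: anything not explicitly granted to the role is refused."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The design choice that matters is the default: an unknown role or an unlisted action falls through to "deny", so forgetting a grant fails safe rather than open.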

Of course, there are no perfect and seamless restrictions in the world. To further ensure that things are done right, and to catch errors early, we also need continuous detection and recording. For example, we must have network analysis and visibility (NAV) capability over network traffic activity, so that detection and recording can proceed together, processing all network traffic in an active, real-time manner, not passively, with delay, or only for internal traffic.

To meet such requirements, the zero-trust model specifically defines the characteristics that network analysis and visibility should possess. These include the ability to scale flexibly and to maintain continuous situational awareness, and cover network discovery (finding and tracking assets), flow data analysis (analyzing traffic patterns and user behavior), packet capture and analysis, network metadata analysis (analyzing network flow packets), and network forensics (assisting incident response and criminal investigation).
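As a rough illustration of flow data analysis, the sketch below aggregates bytes sent per source host from flow records and flags hosts whose volume deviates sharply from the rest, a crude stand-in for the behavioral baselining a real NAV tool performs. The record format and the standard-deviation threshold are assumptions for the example.

```python
from collections import defaultdict
from statistics import mean, pstdev

def flag_outliers(flows, threshold=2.0):
    """Aggregate bytes per source host and flag hosts whose total exceeds the
    mean by `threshold` population standard deviations (an arbitrary cutoff).
    `flows` is an iterable of (src_host, dst_host, n_bytes) records."""
    totals = defaultdict(int)
    for src, _dst, n_bytes in flows:
        totals[src] += n_bytes
    volumes = list(totals.values())
    mu, sigma = mean(volumes), pstdev(volumes)
    if sigma == 0:  # every host behaves identically: nothing stands out
        return []
    return [host for host, total in totals.items() if total > mu + threshold * sigma]
```

A production system would baseline per host over time rather than across hosts at one instant, but the principle, turning raw flow records into a behavioral signal, is the same.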

In the design of the zero-trust architecture, John Kindervag holds that four components are required: the Network Segmentation Gateway (SG), Microcore and Perimeter (MCAP), centralized management, and the Data Acquisition Network (DAN).

The SG consolidates functions that were originally dispersed across multiple security products, such as firewalls, network intrusion prevention systems, web application firewalls, network access control, content filtering gateways, VPN gateways, and other encryption products, and integrates them deeply at the network level. The SG also builds in a packet forwarding engine so that it can sit at the center of the network. Security personnel can define common control policies in the SG, and it must be configured with multiple high-speed network interfaces to handle enormous volumes of network traffic.

The SG here is quite similar to the UTM appliances we are familiar with. However, John Kindervag notes that UTM controls the network boundary, whereas the SG is deployed at, and aimed at, the center of the network.

MCAP refers to the individual network partitions: switching zones formed by connecting to the network interfaces of the SG and other devices. Each zone is served by dedicated micro-core switches that share the same functions and policy control attributes, and each isolation zone becomes a microperimeter. Security personnel can aggregate all the switches of the MCAPs into a unified switching network to achieve centralized management.

So-called centralized management refers to handling operations over the network backplane, which can be defined through a transparent, integrated mechanism that manages all MCAPs as a whole. Moreover, it must be upgraded from command-line operation of individual components to a centralized, easier-to-use management system.

The purpose of the DAN is to highlight the importance of capturing network data. It is itself a kind of MCAP, in which a security information and event management (SIEM) system and network analysis and visibility (NAV) tools can be deployed to centrally acquire, analyze, and log all network traffic.

Using such a network zone, we can process all the traffic that passes through the SG, including the traffic connecting all the MCAPs. By mirroring and forwarding the traffic, we can efficiently collect the packets, Syslog, SNMP, and other information transmitted within it and store it in a single location, facilitating subsequent analysis of network traffic and approaching real-time processing.

Attacks come from all directions, so boundaries must be drawn more finely

In the zero-trust model, the threat of attack is treated as more pervasive. It is like protecting the safety of a president: attention must be paid to the identity, location, and accessibility of the protected objects; border control needs multiple layers of partitions, such as perimeter fences and military police, with special agents surrounding the president's car to form a micro-boundary of protection; and endpoint control is still needed to guard against a remote sniper attack.

Application control strategy based on zero trust model

The level of threat changes with the motivation behind it. Yet whether intent is malicious or benign, endpoint devices and applications can become threats. To deal with different levels of threat, blacklists and whitelists can be used for control; for relatively normal, legitimate endpoints and applications, we can use isolation to enforce protection together with least-privilege access.

The zero-trust model has begun to expand significantly, and can be applied at more levels

Forrester's earliest zero-trust research reports devoted a great deal of attention to network architecture, and network segmentation and isolation do play a very important role there. As a result, in recent years, whenever the concept comes up, most people think of next-generation firewalls as the product type. However, zero trust is not just network isolation but an overall protection strategy involving processes and technologies. Therefore, when Forrester has elaborated the zero-trust model in recent years, it has not restricted it to the network, but extended the protection further.

For example, at the beginning of this year they formally proposed the Zero Trust eXtended Ecosystem, which decomposes the zero-trust model with data at the center, surrounded by the four pillars of people, devices, networks, and workloads, held together by technologies such as network visibility and analytics, and automation and orchestration.

The number of companies that recognize and value zero trust has grown, and both Google and Netflix have begun to adopt it

Beyond the research institution that has long advocated the concept, market views have shifted markedly, and the zero-trust model has recently received a great deal of attention. There are several reasons for this.

One of them is related to Google. The company's chief information officer has said that large enterprises should establish a "zero trust" infrastructure. Google also published the research paper "BeyondCorp: A New Approach to Enterprise Security" and subsequently several more papers exploring the concept and implementation of BeyondCorp. Since 2017, Google has offered new cloud services under this self-developed zero-trust security framework and has actively promoted it. For example, at the Google Cloud Next conference it announced Cloud Identity-Aware Proxy (Cloud IAP), allowing users to build the kind of context-aware secure access environment described by BeyondCorp. Google is proud of this research project, begun in 2011, claiming that it allows most employees to work safely on untrusted networks every day without using VPN protection.

Besides Google, we recently saw another Internet company build a zero-trust architecture: the well-known online streaming company Netflix.

At the Usenix Enigma conference, the company's senior security engineer Bryan Zimmer publicly introduced its zero-trust architecture, called the Location Independent Security Approach (LISA), which rests on three basic principles: (1) the keys to trust are the person's identity and the health status of the endpoint, not the location; (2) the office network cannot be trusted; (3) devices must be isolated.
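The first two LISA principles can be sketched as a trust decision that consults only user identity and device health, never network location. This is an illustrative sketch, not Netflix's implementation; the particular health signals (disk encryption, patch level) are assumed examples of what such a policy might check.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool   # e.g. SSO with MFA succeeded
    disk_encrypted: bool       # illustrative device-health signals
    os_patched: bool
    on_office_network: bool    # recorded, but deliberately never consulted

def grant_access(req: Request) -> bool:
    """Trust derives from who the user is and how healthy the device is;
    being on the office network confers no privilege whatsoever."""
    device_healthy = req.disk_encrypted and req.os_patched
    return req.user_authenticated and device_healthy
```

An unhealthy device on the office LAN is refused, while a healthy, authenticated device on a coffee-shop network is admitted, which is exactly the location-independence the approach is named for.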

In the long run, to improve overall security protection, more organizations should join the ranks of those implementing the zero-trust model. As the new wave of digital transformation continues, application systems and services are bound to operate in ever more open environments. If we keep using the traditional dichotomies (internal versus external, trusted versus untrusted) to define the scope of information security protection, it will be increasingly difficult to cope with the unknown.

If we can face this reality squarely, and think through and construct a zero-trust model suited to our own operating environment, then through continuous verification, recording, analysis, and supervision we can effectively implement both "control" and "protection", rather than a defensive strategy that attends to only one end, and thereby obtain a solid guarantee.

Google invests in the development of a zero-trust model

The zero-trust model is not advocated only by information security vendors; cloud service giant Google has invested in it too, spending six years developing a zero-trust security framework called BeyondCorp. In terms of access control, it shifts protection from the original network boundary to the control of individual devices and users. Today, Google's employees can connect to the company's application systems from anywhere, with no need for encrypted connections over a traditional VPN.

Published: Jun 12, 2020. Source: ithome
