As computing technology evolved and the concept of linking multiple computers into a “network” that could share data took shape, it became clear that a model was needed to define and enable those connections. Enter the OSI model in computer networking.


This model allows various devices and software to “communicate” with one another by creating a set of universal rules and functions. Let’s dig into what the model entails.


History of the OSI Model


In the late 1970s, the continued development of computerized technology saw many companies start to introduce their own systems. These systems stood alone from one another. For example, a computer at Retailer A had no way to communicate with a computer at Retailer B, and neither computer could communicate with the various vendors and other organizations within the retail supply chain.


Clearly, some way of connecting these standalone systems was needed, leading researchers from France, the U.S., and the U.K. to split into two groups – the International Organization for Standardization (ISO) and the International Telegraph and Telephone Consultative Committee (CCITT).


In 1983, these two groups merged their work to create “The Basic Reference Model for Open Systems Interconnection (OSI).” This model established industry standards for communication between networked devices, though the path to OSI’s implementation wasn’t as clear as it could have been. The 1980s and 1990s saw the introduction of another model – the TCP/IP model – which competed against the OSI model for supremacy. TCP/IP gained so much traction that it became the cornerstone model for the then-budding internet, leading to the OSI model in computer network applications falling out of favor in many sectors. Despite this, the OSI model is still a valuable reference point for students who want to learn more about networking, and it still has some practical uses in industry.


The OSI Reference Model


The OSI model works by splitting the concept of computers communicating with one another into seven computer network layers (defined below), each offering standardized rules for its specific function. During the rise of the OSI model, these layers worked in concert, allowing systems to communicate as long as they followed the rules.


Though the OSI model in computer network applications has fallen out of favor on a practical level, it still offers several benefits:


  • The OSI model is perfect for teaching network architecture because it defines how computers communicate.
  • OSI is a layered model with clear separation between layers, so changes to one layer don’t affect the operation of any other.
  • The OSI model offers flexibility because of the distinctions it makes between layers, with users being able to replace protocols in any layer without worrying about how they’ll impact the other layers.

The 7 Layers of the OSI Model


The OSI reference model in computer network teaching is a lot like an onion. It has several layers, each standing alone but each needing to be peeled back to get a result. But where peeling back the layers of an onion gets you a tasty ingredient or treat, peeling them back in the OSI model delivers a better understanding of networking and the protocols that lie behind it.


Each of these seven layers serves a different function.


Layer 1: Physical Layer


Sitting at the lowest level of the OSI model, the physical layer is all about the hows and wherefores of transmitting raw bits as electrical, optical, or radio signals from one device to another. Think of it as the specifications for the pins, cables, voltages, and every other physical component a device needs if it wants to communicate with another device that uses the OSI model.


Layer 2: Data Link Layer


With the physical layer in place, the challenge shifts to transmitting data between devices. The data link layer defines how node-to-node transfer occurs, allowing for the packaging of data into “frames” and the detection and correction of errors that may occur in the physical layer.


The data link layer has two “sub-layers” of its own:


  • MAC – The Media Access Control sub-layer, which offers multiplexing and flow control to govern a device’s transmissions over an OSI network.
  • LLC – The Logical Link Control sub-layer, which provides error control over the physical medium (i.e., the cabling and signals) used to transmit data across a connection.
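
To make framing and error detection more concrete, here is a minimal Python sketch (not a real network driver) that packages a payload into a simplified Ethernet-style frame and verifies a CRC checksum on receipt. The MAC addresses and EtherType value are placeholders chosen purely for illustration.

```python
import struct
import zlib

def build_frame(dest_mac: bytes, src_mac: bytes, payload: bytes) -> bytes:
    """Package a payload into a simplified Ethernet-style frame."""
    ethertype = struct.pack("!H", 0x0800)                    # 0x0800 marks an IPv4 payload
    header = dest_mac + src_mac + ethertype                  # 6 + 6 + 2 header bytes
    fcs = struct.pack("!I", zlib.crc32(header + payload))    # frame check sequence (CRC)
    return header + payload + fcs

def check_frame(frame: bytes) -> bool:
    """Verify the CRC – the data link layer's basic error detection."""
    body, received_fcs = frame[:-4], struct.unpack("!I", frame[-4:])[0]
    return zlib.crc32(body) == received_fcs

frame = build_frame(b"\xaa\xbb\xcc\xdd\xee\xff", b"\x11\x22\x33\x44\x55\x66", b"hello")
print(check_frame(frame))                                    # True unless corrupted in transit
```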

Layer 3: Network Layer


The network layer acts as an intermediary between devices: it takes the data carried in “frames” from the data link layer, addresses it as packets, and routes those packets toward their intended destination. Think of this layer as the postal service of the OSI model in computer network applications.
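
The kind of decision the network layer makes can be sketched with Python's standard ipaddress module. The subnet and gateway addresses below are hypothetical; the point is the choice between delivering a packet locally and handing it to a router.

```python
import ipaddress

# Hypothetical local subnet and default gateway, chosen purely for illustration
local_subnet = ipaddress.ip_network("192.168.1.0/24")
default_gateway = ipaddress.ip_address("192.168.1.1")

def next_hop(destination: str) -> ipaddress.IPv4Address:
    """Decide where a packet goes next – the network layer's core job."""
    dest = ipaddress.ip_address(destination)
    if dest in local_subnet:
        return dest               # deliver directly on the local network
    return default_gateway        # otherwise hand the packet to the router

print(next_hop("192.168.1.42"))   # 192.168.1.42 – local delivery
print(next_hop("8.8.8.8"))        # 192.168.1.1 – forward to the gateway
```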



Layer 4: Transport Layer


If the network layer is a delivery person, the transport layer is the van the delivery person uses to carry their parcels (i.e., data packets) between addresses. This layer regulates the sequencing, sizing, and transfer of data between hosts and systems. TCP (Transmission Control Protocol) is a good example of a transport layer protocol in practical applications.
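
As a quick illustration, here is a minimal sketch using Python's standard socket module to open a TCP (transport layer) connection. The host and port are placeholders, and the bytes sent happen to be an HTTP request riding on top of that connection.

```python
import socket

# Open a TCP (transport layer) connection to a placeholder host and port
with socket.create_connection(("example.com", 80), timeout=5) as conn:
    # TCP guarantees these bytes arrive in order and intact, handling
    # retransmission and sequencing behind the scenes
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    print(conn.recv(1024).decode("utf-8", errors="replace"))
```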


Layer 5: Session Layer


When one device wants to communicate with another, it sets up a “session” in which the communication takes place, similar to how your boss may schedule a meeting with you when they want to talk. The session layer regulates how the connections between machines are set up and managed, in addition to providing authorization controls to ensure no unwanted devices can interrupt or “listen in” on the session.


Layer 6: Presentation Layer


Presentation matters when sending data from one system to another. The presentation layer “pretties up” data by formatting and translating it into a syntax that the recipient’s application accepts. Encryption and decryption are a perfect example, as a data packet can be encrypted to be unreadable to anybody who intercepts it, only to be decrypted via the presentation layer so the intended recipient can see what the data packet contains.
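
Here is a small sketch of those two presentation-layer jobs in Python: formatting data into an agreed syntax (JSON over UTF-8) and encrypting/decrypting it. It assumes the third-party cryptography package is installed, and the record contents are invented for illustration.

```python
import json
from cryptography.fernet import Fernet   # third-party: pip install cryptography

# Formatting/translation: turn an application object into an agreed syntax (JSON, UTF-8)
record = {"user": "alice", "action": "login"}
wire_bytes = json.dumps(record).encode("utf-8")

# Encryption on the way out, decryption on the way back in
key = Fernet.generate_key()
cipher = Fernet(key)
encrypted = cipher.encrypt(wire_bytes)    # unreadable to anyone who intercepts it
decrypted = cipher.decrypt(encrypted)     # the intended recipient recovers the original bytes

print(json.loads(decrypted.decode("utf-8")))   # {'user': 'alice', 'action': 'login'}
```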


Layer 7: Application Layer


The application layer is the front end through which the end user interacts with everything that’s going on behind the scenes in the network. It’s usually a piece of software that puts a user-friendly face on a network. For instance, the Google Chrome web browser acts as an application-layer front end for the entire network of connections that make up the internet.
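
The kind of request a browser makes under the hood can be approximated with Python's standard urllib module. The URL is a placeholder; the point is that this is an application-layer (HTTP) exchange built on top of all the layers below it.

```python
from urllib.request import urlopen

# An application-layer (HTTP) request – the same kind of exchange a browser performs
with urlopen("https://example.com") as response:                 # placeholder URL
    print(response.status)                                       # e.g. 200
    print(response.read(200).decode("utf-8", errors="replace"))  # first bytes of the page
```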


Interactions Between OSI Layers


Though each of the OSI layers in computer networks is independent (which provides the flexibility mentioned earlier), they must also interact with one another to make the network functional.


We see this most obviously in the data encapsulation and de-encapsulation that occur in the model. Encapsulation is the process of adding information to a data packet as it travels, with de-encapsulation being the method used to remove that added data so the end user can read what was originally sent. The previously mentioned encryption and decryption of data is a good example.


That process of encapsulation and de-encapsulation defines how the OSI model works. Each layer adds its own little “flavor” to the transmitted data packet, with each subsequent layer either adding something new or de-encapsulating something previously added so it can read the data. Each of these additions and subtractions is governed by the protocols set within each layer, and a network can only function properly if those protocols govern data transmission correctly, allowing adjacent layers to communicate.
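
A toy sketch of that journey might look like the following Python snippet, in which each layer prepends its own made-up header on the way down and strips it again on the way up. Real headers are binary structures defined by each layer's protocol; the strings here are purely illustrative.

```python
# Toy illustration: each layer wraps the data in its own (made-up) header on the way
# down, and de-encapsulation strips those headers in reverse order on the way up.
LAYERS = ["application", "presentation", "session", "transport", "network", "data-link"]

def encapsulate(payload: str) -> str:
    for layer in LAYERS:
        payload = f"[{layer}-header]{payload}"    # each layer adds its own "flavor"
    return payload

def de_encapsulate(packet: str) -> str:
    for layer in reversed(LAYERS):                # peel the outermost header off first
        header = f"[{layer}-header]"
        assert packet.startswith(header), f"malformed packet at the {layer} layer"
        packet = packet[len(header):]
    return packet

wrapped = encapsulate("hello")
print(wrapped)                   # [data-link-header][network-header]...[application-header]hello
print(de_encapsulate(wrapped))   # hello
```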


Real-World Applications of the OSI Model


There’s a reason why the OSI model in computer network study is often called a “reference” model – though important, it was quickly replaced with other models. As a result, you’ll rarely see the OSI model used as a way to connect devices, with TCP/IP being far more popular. Still, there are several practical applications for the OSI model.


Network Troubleshooting and Diagnostics


Given that some modern computer networks are unfathomably complex, picking out a single error that messes up the whole communication process can feel like navigating a minefield. Every wrong step causes something else to blow up, leading to more problems than you solve. The OSI model’s layered approach offers a way to break down the different aspects of a network to make it easier to identify problems.
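
For instance, a rough bottom-up check might look like the following Python sketch, which tests name resolution, then a transport-layer (TCP) connection, then an application-layer (HTTP) request. The hostname is a placeholder, and a real diagnosis would involve more probes than these three.

```python
import socket
from urllib.request import urlopen

host = "example.com"   # placeholder host to diagnose

# Name resolution: can we map the hostname to a network-layer (IP) address?
ip = socket.gethostbyname(host)
print(f"DNS lookup: {host} -> {ip}")

# Transport layer: can we open a TCP connection on the HTTPS port?
with socket.create_connection((host, 443), timeout=5):
    print("Transport: TCP connection succeeded")

# Application layer: does an HTTP request actually return a response?
with urlopen(f"https://{host}", timeout=5) as response:
    print(f"Application: HTTP status {response.status}")
```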


Network Design and Implementation


Though the OSI model has few practical uses today, as a theoretical model it’s often seen as the basis for the networking concepts that came after it. That makes it an ideal teaching tool for showcasing how networks are designed and implemented. Some engineers even refer to the model when building networks based on other models, with the layered approach helping them make sense of complex networks.


Enhancing Network Security


The concept of encapsulation and de-encapsulation comes to the fore again here (remember the encryption example), as it shows how risky it is to let a data packet travel through a network untouched and unprotected. The OSI model shows how transforming that packet as it goes on its journey makes it easier to protect data from unwanted eyes.



Limitations and Criticisms of the OSI Model


Despite its many uses as a teaching tool, the OSI model has limitations that explain why it sees few practical applications:


  • Complexity – As valuable as the layered approach may be to teaching networks, it’s often too complex to execute in practice.
  • Overlap – The very flexibility that makes OSI great for people who want more control over their networks can come back to bite it. Failing to implement proper controls and protocols can lead to overlapping functions between layers, as can the layered approach itself – in practice, each of the computer network layers still needs the others to work.
  • The Existence of Alternatives – The OSI model walked so other models could run, establishing many fundamental networking concepts that other models executed better in practical terms. Again, the massive network known as the internet is a great example, as it uses the TCP/IP model to reduce complexity and more effectively transmit data.

Use the OSI Reference Model in Computer Network Applications


Though it has little practical application in today’s world, the OSI model is a theoretical framework that played a crucial role in establishing many of the “rules” of networking still used today. Its continued importance is reflected in the fact that many computing courses use the OSI model to teach the fundamentals of networking.


Think of learning about the OSI model as being similar to laying the foundations for a house. You’ll get to grips with the basic concepts of how networks work, allowing you to build up your knowledge by incorporating both current networking technology and future advancements to become a networking specialist.
