The Evolution and Benefits of IoT, SDN and Edge Computing

We live in exciting times where the digital innovations enabled by Edge Computing, SDN, and IoT applications translate into significant business use cases. By 2025, the Edge Computing market is expected to be worth $3.24 billion, with 26 billion smart devices generating at least 500 ZB of data a year.

Everything is being transformed by these technologies' high-powered computing and networking capabilities, from the administrative processes in factories to the power grids in Smart Cities.

Technology experts and evangelists often pose the question, “Which technology is actually driving industry?”

From an operational viewpoint, however, these technologies are distinct, yet they rely on each other to deliver powerful performance at scale.

How IoT, Edge Computing, & SDN are Connected

Edge Computing provides an open platform to perform management, analysis, control, and data-processing tasks at the network edge of IoT. This meets the connection, computing, storage, and application installation needs of things that sense conditions and control actions.

Stu Bailey, the founder and CTO of Infoblox, says, “The Internet of Things is a major driver for SDN. If you have a lot of things, then the most important inhibitor is complexity. The only material that we have to combat an increasing complexity in IT systems is software. There won’t be an Internet of Things without software-defined networks.”

SDN cost-effectively virtualizes IoT networks to enable automatic traffic rerouting, device reconfiguration, and bandwidth allocation to boost performance and reduce complexity. Some clear benefits are greater network transparency through automated security threat detection, application of security policies, and access control. SDN promotes the centralized management of sensors, terminals, communication modules, IoT gateways, and other devices while supporting automatic deployment, security authentication, status monitoring, and remote upgrades.
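To make this concrete, the sketch below shows how an operator script might push a rerouting rule for an IoT device through an SDN controller's northbound REST API. The controller URL, endpoint path, and JSON schema are hypothetical placeholders for this example; real controllers such as OpenDaylight or ONOS expose similar but differently structured APIs.

```python
import requests

# Hypothetical northbound REST endpoint of an SDN controller (placeholder URL).
CONTROLLER_URL = "https://sdn-controller.example.com/api/flows"

def reroute_iot_traffic(device_ip: str, new_next_hop: str, priority: int = 100) -> bool:
    """Push a flow rule that steers traffic from one IoT device toward a new next hop."""
    flow_rule = {
        "match": {"ipv4_src": device_ip},            # traffic originating from the device
        "actions": [{"set_next_hop": new_next_hop}],  # redirect it to the new path
        "priority": priority,
    }
    response = requests.post(CONTROLLER_URL, json=flow_rule, timeout=5)
    return response.status_code in (200, 201)

if __name__ == "__main__":
    if reroute_iot_traffic("10.0.0.42", "10.0.1.1"):
        print("Flow rule accepted by the controller")
```

Because the rule is pushed from one central point, the same script could reroute traffic for thousands of devices without touching any of them individually, which is the operational benefit described above.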

Image Inspired by Carleton.ca

Why IoT, SDN, and Edge Computing Need Each Other

In the current technological scenario, we witness a tremendous increase in connections between heterogeneous devices and the need to control them remotely over reliable IoT networks. As history has shown, networked devices that hold vital data are vulnerable to hacking and illegal monitoring. Thus, securing these devices becomes paramount, especially as IoT turns into "super-heterogeneous networks" operating in a complex environment.

The exponential growth of internet-connected devices, along with the increasing need for real-time computing, continues to drive edge-computing systems. The following growth trend from Business Insider shows how IoT (the estimated number of connected devices) has benefited from edge technology solutions.

Image Source: Business Insider

Edge computing systems accelerate the creation and support of real-time applications by providing rapid computing capabilities close to the data source, for uses such as video processing and analytics, self-driving cars, artificial intelligence, robotics, and more. The following image depicts how the SDN market is expected to grow over the next five years due to increased network traffic and complex data streaming.

Image Source: Markets & Markets

Thus, IoT, SDN, and Edge Computing form the corners of an influential technology triangle that drives modern digital transformations.

The key advantages of these three technologies determine how they complement each other.

| IoT | SDN | Edge Computing |
| --- | --- | --- |
| Real-time access to data and information sitting on remote network devices. | Streamlined enterprise management and provisioning resulting from centralized network provisioning. | Reduced network latency by having compute, storage, and network resources closer to the application. |
| Transparent and efficient communication of network devices to produce faster and more accurate results. | SDN offers a central point of control to distribute security and policy information consistently throughout the enterprise. | Application scalability through decentralized compute and storage resources. |
| Process automation based on business logic to reduce errors, human intervention, turnaround time, and costs. | SDN promotes lower costs through administrative efficiency, improvements in server utilization, and better control of virtualization. | Distributed networks that need reduced bandwidth, leading to reduced costs. |
| Enhanced and continuous data collection through data streaming mechanisms. | Cloud abstraction and unified resource management made easy through SDN controllers. | Distributed networks that offer enhanced security and data protection to data stored in different locations. |

These advantages make all three technologies a significant part of different industries' IoT ecosystems. Read more about individual IoT use cases in our next blog post.

Historical Milestones of IoT, SDN, and Edge Computing

The history of SDN, IoT, and Edge Computing has a bearing on how each is implemented in modern-day applications.

SDN

The roots of SDN can be traced back to 1878, when the first commercial telephone exchange was installed. Then in 1891, Almon Strowger invented the telephone dial, where data (voice) and control (dial pulses) were transmitted over the same insecure channel (the telephone wire) to the phone exchange system.

Later, in 1963, Bell Labs introduced "Touch Tone" to its customers, replacing the mechanical rotary dial with DTMF tones. However, the DTMF signals still shared the same channel as the data. In the early 1980s, the data (voice) and control (DTMF) were separated by introducing central network control points. This allowed for new services such as Alternative Billing Services (ABS), Private Virtual Networks (PVN), SMS, Follow Me, 800 numbers, calling cards, etc.

In the 1990s, active networking was introduced: a network in which the nodes are programmed to perform custom operations on the messages that pass through them. Later in the decade, network virtualization allowed rapid deployment of new applications, and SDN separated the control and data planes to enable centralized control, allow automation, and create a programmable network. NFV virtualizes the components of the network, while SDN centralizes the control of those components.


Industrial Internet of Things

IIoT started in the late 1960s when Dick Morley introduced Programmable Logic Controllers (PLCs) to General Motors for their automatic transmission manufacturing division. The emergence of "ubiquitous computing" (pervasive computing, often considered the successor to mobile computing) in the 1970s gave rise to wireless communication and networking technologies, mobile devices, embedded systems, wearable computers, radio frequency ID (RFID) tags, middleware, and software agents. Internet capabilities, voice recognition, and artificial intelligence (AI) are often also included in this list.

Image Source: ems-summit.com

Ubiquitous computing was pioneered at the Olivetti Research Laboratory in Cambridge. The Active Badge, a "clip-on computer" the size of an employee ID card, was created to let the company track the location of people in a building, as well as the objects to which the badges were attached.

With the introduction of Ethernet in 1980, people began to explore the concept of a network of smart devices. A modified Coke machine at Carnegie Mellon University became the first internet-connected appliance.

The term "Internet of Things" itself was coined by Kevin Ashton in 1999 while presenting an exciting new technology called RFID. Combining RFID with the then-trending internet made sense, and the name "Internet of Things" stuck.

The IoT's potential effect on the global economy led Chinese leaders to designate IoT as a priority area for development in 2009. China subsequently took steps to catalyze domestic IoT research and development (R&D) and infrastructure development through robust planning initiatives and extensive financial support, which has contributed to steep growth over the past nine years. This makes China currently the largest consumer of IoT technology.

Image Source: Marketing China

In 2012, Gartner included a newly emerging phenomenon, "The Internet of Things," on its Hype Cycle of emerging technologies. Around that time, popular tech-focused magazines like Forbes, Fast Company, and Wired started using "IoT" in their vocabulary to promote technological innovation and the newest trend in the interconnected world.

Gartner Hype Cycle 2012

Image Source: wired.com

Edge Computing

Edge computing means different things to different industries. For a manufacturer, "Edge Computing" means that the data is processed before it crosses any wide area network (WAN). Therefore, it is NOT processed in a traditional data center, whether on a private or public cloud.

Edge computing is the latest term for decentralized computing: a distributed computing paradigm that brings computation and data storage closer to where they are needed, improving response times and saving bandwidth.
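As a rough illustration of "processing the data before it crosses the WAN," the Python sketch below aggregates simulated sensor readings locally and uploads only a small summary. The sensor read and cloud upload are stand-in functions invented for this example, not a real device or cloud API.

```python
import random
import statistics

def read_sensor() -> float:
    """Stand-in for a local sensor read (simulated with random values here)."""
    return 20.0 + random.random() * 5.0

def send_to_cloud(summary: dict) -> None:
    """Stand-in for the single, small upload that actually crosses the WAN."""
    print("uploading summary:", summary)

def edge_loop(samples: int = 60) -> None:
    """Collect raw readings at the edge; only the aggregated summary leaves the site."""
    readings = [read_sensor() for _ in range(samples)]
    send_to_cloud({
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    })

if __name__ == "__main__":
    edge_loop()
```

Sixty raw readings stay on the local device; one small JSON summary travels to the cloud, which is where the latency and bandwidth savings come from.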

The underlying technology behind Edge has been evolving for decades.

Edge computing can be traced back to the 1990s, when the first Content Delivery Networks (CDNs), pioneered by Akamai, delivered cached images and videos using distributed servers located closer to the end user. In 1997, pervasive computing (also known as ubiquitous computing), the technology behind IoT, started to appear, offloading resource-intensive applications to local servers.

In the early 2000s, overlay peer-to-peer (P2P) networks leveraged proximity routing to avoid slow downloads from long-distance servers. Around 2006, public cloud computing arrived with Amazon's Elastic Compute Cloud (EC2), where computing and storage resources could be easily rented. In 2009, the cloudlet, a mobility-enhanced mini cloud data center to support resource-intensive mobile applications, was introduced. In 2010, Cisco introduced fog computing, in which a distributed cloud uses intelligent edge nodes to perform a large amount of computation, storage, and communication.

Image Source: Lanner.com

Individually, and even more so in the right combination, these technologies can accomplish far-reaching digital transformations.

Connect with our experts to implement your business use cases with the right combination of these technologies.


Reducing Risk and Improving Agility with Test Automation

 

Methodical software development processes can unlock hidden value in your digital transformation projects. A well-implemented Agile framework that incorporates adaptive, evidence-based planning, iterative delivery, and automation can reduce risk, shorten release cycles, and diminish workforce disruption. Effective test automation, including strategies such as Test-Driven Development, Behavior-Driven Development, and Acceptance Test-Driven Development, can increase predictability but also add their own unique challenges.

Software tests must be repeated frequently during development cycles: each time source code is modified, the tests should be run again. For each release, the software should be tested on all supported operating systems and hardware configurations. Manually repeating these tests is costly and time-consuming. Automated tests, in contrast, can be executed repeatedly at no additional cost, creating efficiency and reducing testing time. Automated testing can reduce the time required to run repetitive tests from days to hours, translating that effort directly into cost savings.

In this article, I’ll address some of the common obstacles encountered in Test-Driven Development (TDD), Behavior-Driven Development (BDD), and Acceptance Test-Driven Development (ATDD) transformations.

First, let’s define each approach.

|   | TDD | BDD | ATDD |
| --- | --- | --- | --- |
| Who | Developer | Developer, QA, Product Owner | Developer, QA, Product Owner |
| Purpose | Unit test | Understanding requirements | Automated acceptance test |
| Notation | Code (software language) | Structured plain English (Gherkin) | Structured plain English (Gherkin) |
| Definition | TDD is a development technique that focuses more on the implementation of a feature | BDD is a development technique that focuses on the system’s behavior | ATDD is a technique like BDD with emphasis on capturing the requirements |
| Process framework | Agile | Agile | Agile |

Let’s dive deeper into the respective methods.

Test-Driven Development  


Test-Driven Development has 3 distinct phases:

  1. Develop test (Test should fail)
  2. Develop code (Test should pass)
  3. Refactor code (Test should pass)
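As a minimal sketch of this red-green-refactor loop, consider a hypothetical discount function developed test-first in Python with pytest; the module and function names are invented for the example.

```python
# test_discount.py -- written first; run with `pytest`.
# Phase 1 (red): this test fails until apply_discount exists and behaves correctly.
import pytest
from discount import apply_discount

def test_ten_percent_discount():
    assert apply_discount(price=100.0, percent=10) == 90.0

def test_negative_percent_rejected():
    with pytest.raises(ValueError):
        apply_discount(price=100.0, percent=-5)
```

```python
# discount.py
# Phase 2 (green): the simplest code that makes the tests pass.
# Phase 3 (refactor): clean up names and structure while the tests stay green.
def apply_discount(price: float, percent: float) -> float:
    if percent < 0:
        raise ValueError("percent must be non-negative")
    return price * (1 - percent / 100)
```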

The introduction of TDD requires a change in philosophy and synchronized cooperation between development and test teams. Developers should write the test, write the code, and refactor, while the test team reviews the test cases for completeness and coverage.

Although TDD is primarily a programmer’s job, testers can collaborate by sharing the test scenarios consisting of:

  • Boundary value cases
  • Equivalence class test cases
  • Critical business cases
  • Cases of the error-prone functionalities
  • Security-level cases
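Boundary-value and equivalence-class cases contributed by testers translate naturally into parameterized unit tests. The sketch below assumes a hypothetical password validator (defined inline as a stand-in) and uses pytest's parametrize to cover both kinds of cases in one test.

```python
# Hypothetical example: boundary-value and equivalence-class cases for a
# password length rule, expressed as a single parameterized pytest test.
import pytest

def is_valid_password(password: str) -> bool:
    """Stand-in for the function under test: accepts lengths 8-64."""
    return 8 <= len(password) <= 64

@pytest.mark.parametrize("password,expected", [
    ("a" * 7,   False),  # boundary: one below the minimum length
    ("a" * 8,   True),   # boundary: exactly the minimum length
    ("a" * 64,  True),   # boundary: the maximum supported length
    ("a" * 65,  False),  # boundary: one above the maximum
    ("",        False),  # equivalence class: empty input
    ("hunter2!", True),  # equivalence class: a typical valid password
])
def test_password_length_rules(password, expected):
    assert is_valid_password(password) is expected
```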

Testers should also participate in defining unit test scenarios and provide their feedback on the test results. TDD offers benefits but has several drawbacks as well:

| The Good | The Bad |
| --- | --- |
| Promotes testable software | Hard to apply to legacy code |
| Refactoring promotes better-written software solutions | Requires maintenance |
| Increases code coverage | Takes time and effort to learn |
| Builds confidence | Restricted to unit testing only |
| Forces developers to think before implementation | |
| Helps to understand the code by providing examples | |
| Defines “Done” | |

Behavior-Driven Development


There are 4 work-flow phases defined in BDD:

  1. Discovery of use-case (defining the test)
  2. Develop test (Test should fail)
  3. Develop code (Test should pass)
  4. Refactor code (Test should pass)

As you can see, BDD is TDD with an extra step: discovery of the use-case. First, you need to define a user story. The story should follow a common agile pattern:

  • As a: the person or role who will benefit from the feature;
  • I want: the feature;
  • So that: the benefit or value of the feature.

Next, the business analysts, developers, and testers must determine different use-cases and the acceptance criteria for each of the scenarios.

Each scenario’s acceptance criteria follow this structure; the individual clauses are called “steps”:

  • Given: the initial context at the beginning of the scenario, in one or more clauses;
  • When: the event that triggers the scenario;
  • Then: the expected outcome, in one clause;
  • And: the expected outcome, in another clause (optional);
  • But: an exception to the outcome, in another clause (not always supported by the framework).
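As an illustrative sketch only, here is how such a scenario might be wired up in Python using the `behave` framework. The feature, step wording, and helper functions are hypothetical stand-ins, not a real application.

```python
# steps/login_steps.py -- hypothetical step definitions for the `behave` framework.
# The scenario itself lives in a .feature file written in Gherkin, e.g.:
#
#   Scenario: Successful login
#     Given a registered user "alice"
#     When she logs in with the correct password
#     Then she sees her dashboard
#
from behave import given, when, then

# Stand-ins for the real application under test (hypothetical).
def create_test_user(username):
    return {"name": username, "password": "correct-horse"}

def login(user, password):
    ok = password == user["password"]
    return {"body": "dashboard" if ok else "login failed"}

@given('a registered user "{username}"')
def step_registered_user(context, username):
    context.user = create_test_user(username)

@when('she logs in with the correct password')
def step_login(context):
    context.response = login(context.user, password=context.user["password"])

@then('she sees her dashboard')
def step_dashboard(context):
    assert "dashboard" in context.response["body"]
```

Because the Gherkin text is plain English, product owners and QA can review or even author the scenarios, while developers keep the step definitions in code.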

BDD encourages conversations across technical disciplines and stakeholders. Critically, this approach helps ensure that teams share a clear understanding of the requirements, which is necessary to work towards a common goal seamlessly. These conversations provide concrete, real-life examples of product behavior; these examples can more explicitly define what needs to be delivered.

Writing quality scenarios requires some practice; it will take time. Start simple, then expand the scenarios with more details. My advice is to keep scenarios simple, with a maximum of 10-15 steps.

| The Good | The Bad |
| --- | --- |
| Promotes collaboration | Takes time and effort to master |
| Removes ambiguity | Time-consuming when applied to legacy code |
| Defines “Done” | |

Acceptance Test-Driven Development  


There are 5 work-flow phases defined in ATDD:

  1. Discovery of use-case (defining the test)
  2. Develop Acceptance test (Test should fail)
  3. Develop test (Test should fail)
  4. Develop code (Test should pass)
  5. Refactor code (Test should pass)
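As a minimal sketch, the acceptance test written in phase 2 might look like the following, expressed in Python with pytest and requests against a hypothetical staging service; the URL, endpoint, and free-shipping rule are all invented for the example.

```python
# Hypothetical ATDD example: the acceptance test captures the business-facing
# requirement ("orders over $100 ship for free") and is written before any
# unit tests or production code exist, so it fails first.
import requests

BASE_URL = "https://staging.example.com/api"  # hypothetical staging service

def test_orders_over_100_dollars_ship_free():
    order = {"items": [{"sku": "WIDGET", "qty": 3, "unit_price": 40.0}]}
    response = requests.post(f"{BASE_URL}/orders", json=order, timeout=5)
    assert response.status_code == 201
    assert response.json()["shipping_cost"] == 0.0
```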

Teams often find that the act of defining acceptance tests during requirement discussions improves their understanding of the requirements themselves. ATDD tests force an agreement on a software's exact behavior and result in a more testable software overall. Further, automated regression tests can rapidly provide valuable feedback regarding business-facing expectations.

Conclusion

TDD (Test-Driven Development), BDD (Behavior-Driven Development), and ATDD (Acceptance Test-Driven Development) are increasingly popular strategies within agile teams. It’s important to understand the differences between these three test methodologies before starting your transformation so that you can select the most appropriate method based on your objectives.