Why Context is Crucial to Successful Edge Computing
The very nature of technology innovation lends itself to the kinds of buzzwords and jargon that can often impede people’s understanding of the technologies themselves. These buzzwords range from metaphorical but ultimately easy-to-understand terms like “cloud” to downright literal terms like “Internet of Things.” Somewhere in between is where we get terms like “edge computing,” where the technology itself and the term used to describe it have one essential thing in common: they require context.
In IT, we call it a “use case,” but that term is essentially a tangible manifestation of the context in which a technology will be most effective, whether that’s a manufacturing scenario, a telematics platform, or an IoT integration. Even within IoT, context is crucial because the technology can be used in something as simple as a smart thermostat, something as advanced as an MRI machine, or any number of use cases in between.
The real challenge when it comes to edge computing isn’t so much creating a device as making sure that device can operate and transmit data reliably.
All too often, people focus on the platform side of the business because that’s where they expect to see ROI on the data and the analytics. But if the right things aren’t happening at the network edge, all of that wonderful back-end processing won’t amount to much.
Edge computing tends to be overlooked
Edge computing tends to be overlooked because most people simply take it for granted. This happens especially often during the manufacturing process, because there’s a mindset that when you buy a device like a laptop or a smartphone, that device is going to communicate with other devices through an interface that’s driven by the user.
We think in terms of “use the smartphone to send data to the laptop, and then use the laptop to send the same data to the printer.”
In the context of IoT devices, that’s not really how things work.
Without proper edge management, maintenance costs can quickly skyrocket for a device that’s meant to be self-sustaining. And we’re not just talking about rolling trucks to troubleshoot a router. In some cases, these devices are literally designed to be buried in the ground alongside crops to measure soil moisture.
IoT devices are small-footprint devices meant to exist and operate on their own
In the IoT realm, we’re building these new, small-footprint devices that are meant to exist and operate on their own. The initial interactions we have with most of our customers and business partners center on questions like, “How do we connect to this thing? How do we deal with this protocol? How do we support this sensor?”
Some of the biggest challenges arise when we get down to the electronics level and start figuring out how to interface from the electronics up into the first level of the software tier.
Communication
In the world of IoT, devices are built with some form of communication standard in mind. But the actual data they transfer, and how they transfer it, is another piece of the puzzle altogether. In addition, the devices have to be maintained for their entire lifespan.
Maybe the temperature went up, or the temperature went down, or the device is just periodically meant to pulse some information back into the network to do something.
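To make that concrete, here is a minimal sketch, in Python, of a device that periodically pulses a temperature reading back into the network. The endpoint URL, device ID, and reporting interval are all hypothetical placeholders; a real device would use whatever transport and protocol its communication standard dictates (MQTT, CoAP, a vendor API, and so on).

```python
import json
import time
import urllib.request
from urllib.error import URLError

# Hypothetical ingestion endpoint and device ID, purely for illustration.
INGEST_URL = "https://iot.example.com/ingest"
DEVICE_ID = "field-sensor-0042"
PULSE_INTERVAL_SECONDS = 900  # report every 15 minutes


def read_temperature_c() -> float:
    """Stand-in for the real sensor read at the electronics level."""
    return 21.5


def pulse_once() -> bool:
    """Send one reading; return True if the platform accepted it."""
    payload = json.dumps({
        "device": DEVICE_ID,
        "ts": int(time.time()),
        "temp_c": read_temperature_c(),
    }).encode("utf-8")
    req = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return 200 <= resp.status < 300
    except URLError:
        return False  # network down; the next pulse will try again


while True:
    pulse_once()
    time.sleep(PULSE_INTERVAL_SECONDS)
```

Even a toy loop like this surfaces the real questions: what happens when a send fails, how often should the device report, and who pays for each of those bytes.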
Most of the time, the people designing these things are confronting these issues for the very first time. People forget it’s not plug-and-play, like a laptop or printer.
Modern cellular devices consume data
Even something as simple as the data itself, and understanding how modern cellular devices consume data compared to their Wi-Fi and 3G counterparts, can derail an entire IoT project before it even gets off the ground. It’s a much more challenging world to deal with.
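As a rough illustration of why, consider a back-of-the-envelope data budget. Every figure below is an assumption, not a measurement, but the arithmetic shows how quickly per-message overhead dominates on a metered connection:

```python
# Back-of-the-envelope data budget for a cellular IoT device.
# All figures are illustrative assumptions, not measurements.
READING_BYTES = 200     # assumed size of one JSON telemetry message
OVERHEAD_BYTES = 600    # assumed TLS/TCP/HTTP overhead per report
REPORTS_PER_DAY = 96    # one report every 15 minutes

bytes_per_day = (READING_BYTES + OVERHEAD_BYTES) * REPORTS_PER_DAY
mb_per_month = bytes_per_day * 30 / 1_000_000
print(f"~{mb_per_month:.1f} MB/month per device")  # ~2.3 MB/month
```

A couple of megabytes a month is trivial over Wi-Fi, but multiplied across a fleet on a metered cellular plan it becomes a recurring line item, and a chatty protocol can inflate it by an order of magnitude.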
Is the device properly scaled and calibrated?
Another key part of that world is making sure that devices are properly scaled and calibrated, and that the data they transmit is handled in a meaningful way. For example, if something goes wrong with the connection, that data needs to be properly queued so that, when the connection is reestablished, it can still end up where it was meant to go.
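One common way to handle that is a store-and-forward queue on the device. The sketch below is a minimal illustration in Python; transmit stands in for whatever send function the device’s actual networking stack provides:

```python
import collections


class StoreAndForward:
    """Buffer readings while offline; flush them in order on reconnect."""

    def __init__(self, transmit, max_queued=10_000):
        self._transmit = transmit  # callable returning True on success
        # Bounded buffer: if it fills, the oldest readings are dropped first.
        self._queue = collections.deque(maxlen=max_queued)

    def send(self, reading):
        self._queue.append(reading)
        self.flush()

    def flush(self):
        # Deliver in arrival order; stop at the first failure, retry later.
        while self._queue:
            if not self._transmit(self._queue[0]):
                break
            self._queue.popleft()
```

A reconnect handler would simply call flush() to drain the backlog; a production device would also persist the queue to flash so that queued readings survive a reboot or power loss.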
Many otherwise very successful companies have learned these types of lessons the hard way by not taking into account how their devices would behave in the real world. For instance, they might test those devices on a lab network when the devices are ultimately designed to use cellular data in the field. The cost of that critical communication function ends up being so high that the device isn’t a viable product from a business standpoint.
What is the first job or function of the device, and will it work as intended?
Of course, it can be even more disastrous when developers focus too much on how the device will work before they’ve put enough time into figuring out whether the physical device itself is going to work in the first place.
Whether it’s a simple telematics device for a vehicle, an advanced module for use in manufacturing, or any number of devices in between, the all-important work of making sure that a given device and its components will work the way they’re intended is often relegated to the people with the least experience.
Appreciate the complexity
In many cases, people get thrown into it, and they don’t appreciate the complexity they’re dealing with until they’ve already suffered any number of setbacks. It could be an environmental issue, a problem with battery life, or even something as simple as where an antenna needs to be placed. Then, once it’s been placed in the field, how will it be updated?
Is the item or device really ready to be shipped? Test, test, test.
When these types of devices fail after already being placed in the field, the cost of replacing and reshipping them alone can completely torpedo the entire product line. That’s why it’s so important to test them in the field in smaller groups and avoid being led down the garden path of scaling up too quickly.
Grand plans are great, but starting small and iterating over time is the ultimate case where an ounce of prevention is truly worth a pound of cure.
Delivering to the customer is the “last mile,” but think “first mile” first
People often talk about edge computing as a “last mile” technology, and, like the last mile of a marathon, it is the most challenging of all.
Historically, large telecom and IT companies have described the connection to a device or the edge as the “last mile,” as in delivering data services from the curb to the house.
But that is an incorrect viewpoint in IoT. Everything starts with the device, the originator of the data. Therefore, connecting to the device and delivering data to the application infrastructure is crossing the “first mile.”
Either way, once we have the proper understanding and context of how edge computing functions in the real world, the finish line is already in sight.
Image Credit: valdemaras d.; Pexels; thank you!