How do we define edge computing? Picture yourself as the head chef of a big restaurant in your city. You have hired 16 junior chefs, but they are new to restaurant cooking and low on confidence. For each step in each recipe, they run to you for approval: one junior chef brings you a spoonful of soup every time he adds a spice, to check that it still tastes right; another rushes over every time she adds sugar to cake batter. Not only are you overworked as 16 chefs clamor for validation all day, but the chefs have run themselves ragged shuttling between their counters and yours.
What if instead you told them, “I have faith in all of you. I hired you because you are good learners with the promise of greatness. I am giving you complete autonomy in your day-to-day cooking. I will provide the recipes, but while cooking, you are the judge of taste, texture, and temperature. You don’t need my validation for every step. Come to me only if a customer complains about a dish; I will then intervene, take care of the customer, and help you improve.”
You have just established the groundwork for edge computing within your restaurant: each junior chef uses their own expertise to complete a dish, while you simply oversee their learning and guide them when they go wrong.
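The restaurant analogy maps naturally onto code: edge nodes handle requests locally and contact a central server only when something goes wrong. Here is a minimal sketch of that pattern; the `EdgeNode` and `CentralServer` classes, the string-based "work", and the complaint check are all hypothetical illustrations, not a real edge-computing framework.

```python
class CentralServer:
    """The head chef: intervenes only when something goes wrong."""
    def handle_escalation(self, node_id, complaint):
        # Centralized handling: in a real system this might log the
        # failure and push an updated model or config back to the node.
        return f"server guidance for node {node_id}: review '{complaint}'"

class EdgeNode:
    """A junior chef: processes requests locally, escalates only failures."""
    def __init__(self, node_id, server):
        self.node_id = node_id
        self.server = server

    def process(self, request):
        # Local decision-making: no round trip to the server per step.
        result = request.upper()      # stand-in for real local work
        if "BURNT" in result:         # stand-in for a "customer complaint"
            return self.server.handle_escalation(self.node_id, request)
        return result

server = CentralServer()
nodes = [EdgeNode(i, server) for i in range(16)]  # the 16 junior chefs

print(nodes[0].process("soup order"))   # handled entirely at the edge
print(nodes[1].process("burnt cake"))   # escalated to the central server
```

The key design point mirrors the story: the common path never touches the server, so the server's load scales with the number of *failures*, not the number of requests.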