There's a lot crammed into Core Solutions and Management Tools:
1 | Azure IoT Hub
IoT Hub acts like command and control because you can send messages to devices. That's called cloud-to-device, or C2D.
D2C is the opposite: device-to-cloud.
Messages between cloud and device are encrypted.
Use IoT Hub to send firmware updates and other files down to the devices.
Device Twins – Every physical IoT device has a JSON equivalent in IoT Hub. Each device twin can contain metadata about the device, stored as Tags.
You can take actions based on Tags. For example, you can control firmware update rollouts by only updating the IoT devices in a certain location or environment first, testing, then continuing 1 group at a time in case things go sideways.
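The staged-rollout idea above can be sketched in a few lines of Python. Everything here is hypothetical: the twins are plain dicts, and the tag names (`environment`, `location`) are made up; a real solution would query twins through the IoT Hub service SDK.

```python
# Hypothetical sketch of a staged firmware rollout driven by twin tags.
# Twins are plain dicts here, not real IoT Hub device twins.

def rollout_groups(twins, tag="location", canary_value="testing"):
    """Order devices into rollout waves: the canary/testing group first,
    then one wave per distinct location."""
    canary = [t["deviceId"] for t in twins
              if t["tags"].get("environment") == canary_value]
    waves = [canary]
    locations = sorted({t["tags"].get(tag) for t in twins
                        if t["tags"].get("environment") != canary_value})
    for loc in locations:
        waves.append([t["deviceId"] for t in twins
                      if t["tags"].get(tag) == loc
                      and t["tags"].get("environment") != canary_value])
    return waves

twins = [
    {"deviceId": "dev1", "tags": {"environment": "testing", "location": "nyc"}},
    {"deviceId": "dev2", "tags": {"environment": "prod", "location": "nyc"}},
    {"deviceId": "dev3", "tags": {"environment": "prod", "location": "sea"}},
]
print(rollout_groups(twins))   # canary wave first, then one wave per location
```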
That is possible because the device twin maintains 2 copies of each property:
- Reported property
- Desired property
If you want to change a setting on the IoT device, you set its desired property. Then the next time the device calls home, it picks up the desired property and sets the value on the device.
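That desired/reported handshake can be modeled with plain dicts. This is illustrative only; a real device would use the Azure IoT device SDK instead of touching the twin JSON directly.

```python
# Toy model of the device-twin sync: on check-in, the device applies
# each desired property and reports the new value back.

def device_checks_in(twin, device_state):
    """Device picks up desired properties and reports them back."""
    for key, value in twin["properties"]["desired"].items():
        device_state[key] = value                      # apply the setting
        twin["properties"]["reported"][key] = value    # report it back
    return device_state

twin = {"properties": {"desired": {"telemetryInterval": 30},
                       "reported": {"telemetryInterval": 60}}}
state = device_checks_in(twin, {"telemetryInterval": 60})
print(state["telemetryInterval"])       # 30
print(twin["properties"]["reported"])   # reported now matches desired
```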
To deploy a shitload of IoT devices, use the IoT Hub Device Provisioning Service (DPS).
DPS uses enrollment groups to add devices to your IoT Hub. Each device is uniquely identified by a certificate or a Trusted Platform Module (TPM) chip on the device. Once the IoT device wakes up and calls out, the enrollment group tells the device which IoT Hub to communicate with.
As the IoT devices send messages to IoT Hub, you can route them to other services like Event Hubs, or just store them in Azure Storage.
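A sketch of the routing idea, with a made-up rule format (real IoT Hub routes use a query syntax over message properties, not Python lambdas):

```python
# Hypothetical routing table: each route pairs a predicate with an
# endpoint name. A message goes to every route whose predicate matches.

routes = [
    (lambda m: m.get("temperature", 0) > 80, "event-hub-alerts"),
    (lambda m: True, "storage-archive"),   # catch-all: archive everything
]

def route(message):
    return [endpoint for predicate, endpoint in routes if predicate(message)]

print(route({"deviceId": "dev1", "temperature": 95}))
# a hot reading goes to both the alert endpoint and the archive
```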
There are 2 pricing tiers: Basic and Standard.
Choose the lowest possible tier and edition when you start, because you can't scale down. You can only scale up! You can go from Basic to Standard, but not the reverse. You can go from S1 to S2, but not the reverse. However, I wonder if you can go from B1 to the Standard Free edition?!
Features that are only available in the Standard Tier:
- Near real-time streaming
- C2D messaging
- Device management, device twin, and module twin
- IoT Edge for handling IoT devices at the network edge
If you use DPS, there's an additional charge of $0.123 per 1,000 operations.
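Quick back-of-the-envelope math using that per-operation figure (check the current Azure pricing page before trusting the number):

```python
# Rough DPS cost estimate, assuming the $0.123 per 1,000 operations
# figure from the notes above. Pricing changes; verify before budgeting.

def dps_cost(operations, price_per_1000=0.123):
    return round(operations / 1000 * price_per_1000, 2)

print(dps_cost(50_000))   # cost of provisioning 50,000 devices once
```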
2 | IoT Central
IoT Central is a SaaS offering for monitoring IoT devices without having to do complex configuration. Go to https://apps.azureiotcentral.com and create an app.
You can add simulated devices in order to test some shit out. Can’t do that in IoT Hub.
IoT Central has 3 built-in roles. But you can create your own custom roles also:
- Administrator – Full access to everything
- Builder – Edit pages
- Operator – Use the application
IoT Central uses Rules that get triggered while you are monitoring devices. When a rule fires, there are endless options available to you: IoT Central can send a detailed email, trigger a webhook that calls an Azure Function, etc.
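Here's a hypothetical handler for such a webhook call. The payload shape below is invented for illustration; it is not IoT Central's actual webhook schema.

```python
# Made-up rule-trigger payload handler: parse the JSON body a rule
# webhook might POST and produce a human-readable summary.
import json

def handle_rule_webhook(body):
    payload = json.loads(body)
    device = payload["device"]
    rule = payload["rule"]
    return f"Rule '{rule}' fired for device '{device}'"

body = json.dumps({"device": "thermostat-7", "rule": "Temp over 80F"})
print(handle_rule_webhook(body))
```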
3 | Azure Sphere
Azure Sphere is an ecosystem designed to be a secure environment starting at the chip in IoT and smart devices. Microsoft developed the Azure Sphere MCU chip, which has security built into it. MCU stands for Microcontroller Unit: it bundles the processor, memory, and I/O on one chip, which is why it's an MCU rather than an MPU (Microprocessor Unit).
The MCU's operating system is Azure Sphere OS, which is Linux-based and customized for Azure Sphere. A huge benefit of this is the ability to patch the chip. No more perpetually vulnerable devices that never get patched.
Third parties can run their own code to fit the needs of their particular application.
To use an MCU, you purchase it from a Microsoft distributor. You get:
- Azure Sphere certified MCU (MediaTek MT3620 AN) = $8.65 each
- A license for the Azure Sphere Security Service. Updates through July 2031
- A license for the Azure Sphere OS
4 | Azure Synapse Analytics
Azure Synapse is basically SQL Data Warehouse with a new wrapper and some added stank on it. It handles Big Data queries.
Azure Synapse runs in an Azure Synapse Cluster. Each cluster has 4 things in it:
- Synapse SQL – the data warehouse
- Apache Spark integration
- Data integration of Spark and Azure Data Lake Storage
- Azure Synapse Studio (web UI)
Run queries against the big data on the Compute Nodes (queries run in parallel for speed). The Compute Nodes run a component called the Data Movement Service (DMS) that moves the data efficiently between the Compute Nodes.
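A toy model of that parallel pattern: hash-distribute rows across pretend compute nodes, aggregate each shard, then combine the partial results (the data shuffling between nodes is roughly what DMS coordinates). Pure simulation in plain Python, not the Synapse engine.

```python
# Simulate distributed query execution: shard rows by key hash,
# aggregate each shard independently, then combine the partials.
from collections import defaultdict

def distribute(rows, nodes=4):
    shards = defaultdict(list)
    for row in rows:
        shards[hash(row["key"]) % nodes].append(row)
    return shards.values()

def query_sum(rows, nodes=4):
    partials = [sum(r["value"] for r in shard)
                for shard in distribute(rows, nodes)]
    return sum(partials)   # final combine step

rows = [{"key": f"k{i}", "value": i} for i in range(10)]
print(query_sum(rows))   # same answer as a single-node sum
```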
Big data nerds typically use Apache Spark, a big data processing engine, and Azure Synapse tightly integrates with Spark. Azure Synapse uses Azure Data Lake Storage; in a data lake, related data is stored in containers.
5 | HDInsight
An HDInsight cluster performs analytics after breaking up the large data blocks into segments. Those segments are sent to nodes in the cluster so that they can do the work in parallel and produce result sets.
Think Hadoop here, since HDInsight is Microsoft's cloud implementation of it. It makes it possible to manage clusters of computers for distributed big data processing. BUT IT'S NOT JUST HADOOP!
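The split-the-work-across-nodes idea can be mimicked with a tiny map/combine word count. This is a plain-Python simulation using threads, not actual Hadoop:

```python
# Minimal map/combine sketch of what a Hadoop-style cluster does:
# split the input into segments, count words per segment in
# parallel, then merge the per-segment results.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def word_count(text, segments=3):
    lines = text.splitlines()
    chunk = max(1, len(lines) // segments)
    parts = [lines[i:i + chunk] for i in range(0, len(lines), chunk)]
    with ThreadPoolExecutor() as pool:
        counters = pool.map(lambda p: Counter(" ".join(p).split()), parts)
    total = Counter()
    for c in counters:
        total += c          # merge the partial counts
    return total

text = "big data\nbig clusters\ndata data"
print(word_count(text)["data"])   # 3
```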
HDInsight supports these Cluster Types as well:
- Hive – for SQL-like queries
- Pig – for a scripting language
- Oozie – for workflow scheduling
- HBase – extremely fast NoSQL database
- Storm – unbounded streams of data in real-time
- Spark – in-memory cache across parallel operations
- Interactive Query – in-memory analytics using Hive and LLAP, which has processes that execute fragments of Hive queries
- R Server – big data analytics using R
- Kafka – asynchronous data streams, often from IoT devices
6 | Azure Databricks
Azure Databricks takes raw unstructured data and performs data modeling on it, so it's optimized for machine learning models.
Databricks is the company founded by the creators of Apache Spark. Now they operate their own platform, called Databricks. Azure Databricks was built natively for Azure to take advantage of tight integrations with Azure services.
Use the Databricks workspace in Azure to interact with your data.
Databricks does all its work using clusters. Databricks also uses a serverless model. When you run a job Azure allocates VMs to the cluster to process that job, then kills them off when processing is complete.
Notebooks are a way to present and interact with related data. Each Notebook contains data as well as visualizations to represent the data.
Databricks paths for queries start with
The Databricks Runtime ML includes popular libraries for third-party machine learning frameworks. It's also possible to use distributed deep-learning libraries like Horovod.
Once you're done with your machine learning model in Databricks, you productionalize it by exporting it for use in an external machine learning system. You have 2 methods of productionalizing:
- Databricks ML Model Export
7 | Azure Machine Learning
Artificial Intelligence done here.
Azure Machine Learning offers SDKs for Python and R.
Has 2 editions:
- Basic Edition – only access to ML SDKs and Notebooks
- Enterprise Edition – adds more features including the visual designer
Pricing is based on usage of the VMs where your Azure Machine Learning assets are running, plus a surcharge for doing machine learning and a small amount per hour of usage. Microsoft says you can save money if you reserve your usage for 1 or 3 years. Fuck that!
8 | Cognitive Services
SaaS ML Models that you can use in your ML solutions.
Computer Vision is an interesting API that makes it easy to build a machine learning engine that can extract info from images, and not just people in the image but objects too.
The Video Indexer API can do the same thing but with videos.
Speech APIs do translations in real time.
Pricing is based on transactions.
9 | Azure Bot Service
Think online Chat bots for support. These are AI-driven interactions between humans and this fucking thing.