Automating with function apps in a scalable and sustainable way often requires a completely different way of working, which comes with significant time investments and a steep learning curve. So, is it actually worth the effort?
Most people building function apps in .NET, Python, or PowerShell (for example, Azure Functions with PowerShell) will recognize that once a solution starts growing, it becomes very difficult to keep track of how it is structured and what is in it.
Initially, development – without standardization and building blocks – is faster and still easy to understand: the whole reason to use function apps in the first place.
However, it becomes increasingly difficult when a solution grows bigger: business logic gets duplicated, there is no common structure for data, and it becomes hard to see how everything fits together.
What we see at our customers is that there is often little insight into how to do things differently. Developers are used to a certain way of working, and they build on top of the complexity that they or others have created. Completely changing that way of working requires a significant time investment – and a steep learning curve.
One of our manufacturing customers asked us to help with this challenge.
Their environment (see abstracted visual below) contained multiple functions that implemented the same or similar business logic without re-using it. In addition, there was no common ground for structuring data. This made the environment very hard to expand (not scalable).
To increase flexibility and scalability in this central automation project for our customer, we’ve created a solution containing five main pillars (or ‘functionality sets’).
We’ve created modules, built the logic required to perform a certain action (e.g. connect to a database), and clearly defined which inputs are required to perform that action. This ensures that the business logic stays contained within each individual module, which can easily be shipped and re-used.
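To give an idea of what such a building block can look like, here is a minimal sketch of a module function with clearly defined inputs. The function and parameter names are our own illustration (not the customer’s actual code), and the database call itself is left as a placeholder.

```powershell
# ConnectDatabase.psm1 – illustrative building block with explicitly declared inputs
function Connect-Database {
    [CmdletBinding()]
    param (
        # Every input the action needs is declared up front, so callers know exactly what to supply
        [Parameter(Mandatory)] [string]       $ServerInstance,
        [Parameter(Mandatory)] [string]       $Database,
        [Parameter(Mandatory)] [pscredential] $Credential
    )

    # The actual database client call would go here; we return a connection context
    # object so this sketch stays runnable without extra packages.
    [pscustomobject]@{
        ServerInstance = $ServerInstance
        Database       = $Database
        UserName       = $Credential.UserName
        ConnectedAt    = Get-Date
    }
}

Export-ModuleMember -Function Connect-Database
```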
Additionally, we defined and implemented strict coding guidelines. All modules must be created according to these standards, which makes the code easier to read and understand. And by enforcing these guidelines, we safeguard the quality of the code.
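The article does not name the tooling behind these guidelines; in PowerShell, one common way to enforce them is PSScriptAnalyzer with a shared settings file, along these lines:

```powershell
# PSScriptAnalyzerSettings.psd1 – example rule set; the specific rules are our assumption
@{
    IncludeRules = @(
        'PSUseApprovedVerbs'                     # functions must follow Verb-Noun naming
        'PSAvoidUsingPlainTextForPassword'       # secrets never travel as plain strings
        'PSUseDeclaredVarsMoreThanAssignments'   # no dead variables
        'PSUseConsistentIndentation'             # uniform layout across all modules
    )
    Rules = @{
        PSUseConsistentIndentation = @{
            Enable          = $true
            IndentationSize = 4
        }
    }
}
```

Running `Invoke-ScriptAnalyzer -Path ./Modules -Recurse -Settings ./PSScriptAnalyzerSettings.psd1` in the build pipeline then flags every module that violates the standards.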
Behind the logic, we also create UIs or command interactions between our automation framework and whatever sits behind it. This way, functions can be used and called from any CLI or UI that supports HTTP calls.
This has the added benefit that business logic and data can be visualized for non-technical users.
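To illustrate what such an HTTP entry point looks like, here is a generic sketch based on the standard Azure Functions PowerShell HTTP trigger (not the customer’s actual code); the wrapped module call is a placeholder:

```powershell
# run.ps1 – HTTP-triggered Azure Function (PowerShell) exposing module logic over HTTP
using namespace System.Net

param($Request, $TriggerMetadata)

# Any CLI or UI that can send an HTTP request can call this endpoint
$vaultName  = $Request.Query.VaultName  ?? $Request.Body.VaultName
$secretName = $Request.Query.SecretName ?? $Request.Body.SecretName

# Here the function would delegate to a shared module function;
# for this sketch we simply echo the requested parameters back.
$result = @{
    vaultName   = $vaultName
    secretName  = $secretName
    requestedAt = (Get-Date).ToString('o')
}

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = ($result | ConvertTo-Json)
})
```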
We defined the required format for the data – what data has to look like, also further down the automation chain. This way, we always know that when we automate operations in the cloud, on the stack, or anywhere else in the field, data and models are standardized, enhancing shareability and common understanding.
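The article does not spell out the data model itself, but in PowerShell such a standard can be captured in a class that every module consumes and produces. A hypothetical example (class and property names are ours):

```powershell
# VmRequest.psm1 – hypothetical standardized data model shared across modules
class VmRequest {
    [ValidateNotNullOrEmpty()] [string]            $Name
    [ValidateSet('Azure', 'OnPremStack')] [string] $Platform
    [ValidateRange(1, 64)]  [int]                  $CpuCount
    [ValidateRange(1, 512)] [int]                  $MemoryGb
    [hashtable]                                    $Tags = @{}
}

# Every module in the chain accepts and returns the same shape, so cloud,
# stack, and field automation all speak the same language.
function New-VmRequest {
    [CmdletBinding()]
    param([Parameter(Mandatory)] [hashtable] $Properties)
    [VmRequest] $Properties
}

Export-ModuleMember -Function New-VmRequest
```

A caller can then do, for example, `New-VmRequest -Properties @{ Name = 'app-01'; Platform = 'Azure'; CpuCount = 4; MemoryGb = 16 }`, and every downstream module knows exactly what it receives.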
Finally, to actively promote the re-use of components, we offer them in a central marketplace that allows easy access and re-use. Guidelines and documentation are made available in the same location, to speed up learning how to use each module.
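The article does not describe how the marketplace itself is built; one common way to realize such a marketplace is a private PowerShell repository. Under that assumption (repository name, URL, and module name below are placeholders), publishing and consuming a module looks roughly like this:

```powershell
# Register the central marketplace once per machine or build agent (URL is a placeholder)
Register-PSRepository -Name 'ItilityMarketplace' `
    -SourceLocation  'https://example.com/nuget/modules' `
    -PublishLocation 'https://example.com/nuget/modules' `
    -InstallationPolicy Trusted

# Publish a re-usable building block, including its manifest and documentation
Publish-Module -Path ./Modules/Azure.KeyVault -Repository 'ItilityMarketplace' -NuGetApiKey $apiKey

# Consumers discover and install building blocks from the same central place
Find-Module -Repository 'ItilityMarketplace'
Install-Module -Name 'Azure.KeyVault' -Repository 'ItilityMarketplace'
```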
Using these five pillars, or functionality sets, the environment changed from one with redundant functions into a scalable and neatly organized environment in which functions are re-used.
Let’s give you a practical example of how the solution works.
We’ve created an Azure module, consisting of two sub-modules: KeyVault and StorageAccount. The parent module has a manifest, defining what you want to expose from the parent. Each sub-module also has its own manifest, defining the module, its version, connections, and so on.
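To give an impression, manifests along those lines could look as follows; the field values are our illustration, not the customer’s actual manifests:

```powershell
# Azure/Azure.psd1 – illustrative parent manifest: defines what the parent exposes
@{
    ModuleVersion     = '1.0.0'
    GUID              = 'b7a4c9e1-0000-0000-0000-000000000000'
    NestedModules     = @('KeyVault/KeyVault.psd1', 'StorageAccount/StorageAccount.psd1')
    FunctionsToExport = @('Get-KeyVaultSecret', 'New-StorageContainer')
}

# Azure/KeyVault/KeyVault.psd1 – illustrative sub-module manifest: version, dependencies, exports
@{
    RootModule        = 'KeyVault.psm1'
    ModuleVersion     = '1.2.0'
    GUID              = 'b7a4c9e1-0000-0000-0000-000000000001'
    RequiredModules   = @('Az.KeyVault')      # the connection this sub-module depends on
    FunctionsToExport = @('Get-KeyVaultSecret')
}
```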
And within these modules, we created functions. An example: ‘get a secret from the Key Vault’.
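A minimal sketch of that function, assuming it wraps the Az.KeyVault cmdlet (the name Get-KeyVaultSecret is our own illustration):

```powershell
# Azure/KeyVault/KeyVault.psm1 – the 'get a secret from the Key Vault' building block
function Get-KeyVaultSecret {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)] [string] $VaultName,
        [Parameter(Mandatory)] [string] $SecretName
    )

    # Delegates to Az.KeyVault (declared in the manifest's RequiredModules);
    # callers never need to talk to Azure directly.
    Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName -AsPlainText
}

Export-ModuleMember -Function Get-KeyVaultSecret
```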
Then, we define another module: vCenter. This module connects to an on-premises vCenter instance, where it can execute multiple functions based on data input. To make this connection, it first needs a password from the Key Vault. We can then easily call the ‘KeyVault’ module we built earlier and re-use its functionality.
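Assuming the vCenter module is built on VMware PowerCLI (the article does not state which client is used), that re-use could look like this; the function names mirror the hypothetical modules sketched above:

```powershell
# vCenter/vCenter.psm1 – illustrative sketch; assumes VMware PowerCLI and the KeyVault module above
function Connect-VCenter {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)] [string] $Server,
        [Parameter(Mandatory)] [string] $UserName,
        [Parameter(Mandatory)] [string] $VaultName
    )

    # Re-use the KeyVault building block instead of re-implementing secret retrieval
    $password = Get-KeyVaultSecret -VaultName $VaultName -SecretName "$UserName-password"

    # PowerCLI call that opens the session against the on-premises vCenter
    Connect-VIServer -Server $Server -User $UserName -Password $password
}

Export-ModuleMember -Function Connect-VCenter
```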
If you create an Azure Functions app and add the modules you’ve built, everything is implemented in the function app for you, so you can immediately start calling the functions.
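In an Azure Functions PowerShell app, that typically means the shared modules are placed in the app-level Modules folder (which the Functions host puts on the module path) or listed as managed dependencies. A sketch under those assumptions; module names and versions are illustrative:

```powershell
# requirements.psd1 – managed dependencies the Functions host downloads for you
@{
    'Az.KeyVault' = '5.*'   # version is illustrative
}

# profile.ps1 – runs on cold start; modules placed in the app's ./Modules folder
# (e.g. ./Modules/Azure.KeyVault, ./Modules/vCenter) are on the module path,
# so every run.ps1 can call Get-KeyVaultSecret or Connect-VCenter directly.
Import-Module Azure.KeyVault, vCenter
```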
The solution comes with a few downsides.
The initial setup takes longer – due to configuring standards, defining functions, and creating manifests.
The learning curve is steeper since people need to adjust to a new way of working instead of applying their own methods.
And this way of working is only supported in PowerShell 7, so most on-premises stacks (still running PowerShell 5) cannot use it.
But – if you ask us – these downsides do not outweigh the benefits of the solution.
At Itility, we love automation. This use case shows how central automation can benefit any company working with high volumes of repetitive tasks, complex workflows, or rapidly changing demands. It empowers organizations to operate more efficiently, scale effortlessly, and maintain robust and reliable processes. With that, you can respond more quickly to changing demands while ensuring cost-effectiveness and high productivity.
So our advice: stop making your environment more complex, and invest in automation… IT as a utility!