Investing in software development yourself?

31 October 2023  -  10 minutes

HydroLogic research paper

By Arnold Lobbrecht

Organisations that start developing software themselves step onto a slide that ends in the ball pit of management, maintenance and unexpectedly high costs. I am not talking about professional ICT companies, which know what software development means in the long run. I am talking about organisations that want to improve specific business processes using today’s highly accessible software development tools and cloud services. Do you, too, see great opportunities in data science and want to invest in software development? Then read on.

Advanced ICT is needed, but there are pitfalls

Optimising business processes today requires deployment of advanced ICT. In water management, the introduction of ICT is in full swing, which is necessary to combat the effects of climate change. There is currently a host of tools for deploying ICT and data science in water management. Data lakes, intelligent scripting, visualisations, portals; it is all readily available these days. In itself, this offers opportunities, but there is also another side to it, which I would like to elaborate on in this article.

Indeed, what concerns me is the period after the proof-of-concept phase, when it is clear that advanced ICT offers more insight and prospects for action. It is the phase in which what has been developed must also be managed: the phase in which data piles up in the lakes and control of that data is needed to keep quality guaranteed; in which demonstrably working back-ups must be ensured; in which version management is needed for the software and documentation must be kept up to date; in which test versions of the software are needed that can run independently of the production computers; in which security and 24/7 support must be considered in case something goes wrong with a business-critical application; and in which more processing capacity is needed during intensive use.

Practical experience with software development costs

Over the past 20 years at HydroLogic, we have gained experience in the above topics. What became clear early on is that professional software development also requires a corresponding professional organisation with a typical ICT business model, organisational structure and way of working. In this article, I focus on the experiences we and also other ICT companies have gained; on costs associated with software development, management, maintenance and hardware.

In practice, the cost of operational software is much higher than many realise, especially in the long term. After a few productive years of delivering new software functionality, progress tends to slow, with organisations spending more time maintaining old software than creating new functionality. Data processing and storage costs also often turn out to be much higher than expected.

DevOps – the standard model

In ICT, DevOps is the common approach to software development, where practical experience of running the software (Operations) is incorporated into the development of new software (Development). Working in rapid development and management cycles means working adaptively and keeping product quality high. It means that the team of developers also gets involved in running the software and fixing problems if they arise in the process.

To illustrate, we set up a simple thought model. A small team of three software developers is working on new software, but they will also work on maintaining what they have created themselves. In year one, all their time is available for new functionality. But, from year two, they also have to maintain what was created earlier.

In simple terms, the latter includes: investigating user-discovered problems; identifying and fixing any bugs; making the software more robust to run properly under all circumstances; making updates to used code libraries; restructuring the code; updating documentation; managing and scheduling tickets; etc.
In practice, it turns out that maintaining software costs about 20% of its original development time, every year. So everything that is developed consumes 20% of the original development time in management and maintenance in each subsequent year. That sounds like a lot, and it is: plotted graphically, a staggering picture emerges (Fig. 1).

Fig. 1. DevOps cost trend, at annually constant capacity for new development and maintenance; maintenance of new software costs 20% of development time.

After only five years, 60% of developers’ time is needed for maintenance; and after 10 years, it is more than 85%! Even if an organisation develops extremely future-oriented and highly professional software and could reduce management and maintenance costs from the usual 20% to 15%, after five years half of all available developer time is needed for maintenance. In short, with this simple model, mainly intended to provide insight into the phenomenon, your investment in software will largely turn into a cost of maintenance over time.
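To make the arithmetic behind Fig. 1 concrete, here is a minimal sketch in Python (my own illustrative reconstruction of the thought model, not HydroLogic code): with constant team capacity and an annual maintenance burden of 20% of everything built so far, the share of time left for new development shrinks geometrically.

```python
def new_dev_share(years, maint_rate=0.20):
    """Yearly fraction of a constant-capacity team's time left for new
    development, when maintaining everything built so far costs
    `maint_rate` of its original development time per year."""
    shares = []
    built = 0.0  # cumulative development effort, in team-years
    for _ in range(years):
        new_dev = max(0.0, 1.0 - maint_rate * built)
        shares.append(new_dev)
        built += new_dev
    return shares

shares = new_dev_share(10)
# Year 5: ~41% left for new work, i.e. ~59% maintenance;
# year 10: ~13% left, i.e. more than 85% maintenance.
```

The closed form is simply (1 − maint_rate)^(n−1) for year n, which also shows why even reducing the rate to 15% still leaves roughly half of all capacity consumed by maintenance after five years.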

More effective with software redevelopment and team growth

The simple example assumes that software, once developed, is never replaced by new software. In that case, the entire code base gradually becomes obsolete. It is therefore common to partially redevelop the software. If the developers also spend time rewriting the software, so that all software code is replaced after 7 years on average, the software stays up to date and the dependence on detailed knowledge of old software remains limited. But however you look at it, the time available for developing entirely new software functionality does not increase.

The only way to ensure that the annual capacity for new software development is maintained is to grow the development and operations team. Fig. 2 shows what happens if the team grows by 15% every year and all developed software is replaced every 7 years. In that situation, an organisation can spend barely half its capacity on new software development after year 5. Moreover, by year 10, the original team of 3 FTEs has grown to over 10 FTEs.

Fig. 2. Evolution of team capacity (FTE) for new development, redevelopment and maintenance, assuming annual team growth of 15% and replacement of software components each time after 7 years. Over time, barely half the capacity is available for new software development.
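The extended model can be sketched the same way. The code below is my own rough reconstruction of the assumptions stated in the caption (15% annual team growth, 20% maintenance, rebuild after 7 years); the exact model behind Fig. 2 may differ in detail.

```python
def team_model(years, start_fte=3.0, growth=0.15,
               maint_rate=0.20, replace_after=7):
    """Rough sketch: the team grows by `growth` per year, maintenance
    costs `maint_rate` of the live code base annually, and each piece of
    code is rebuilt `replace_after` years after it was (re)written.
    Returns (capacity, new_dev) per year, both in FTE."""
    capacity, new_dev = [], []
    code = []  # live code base: list of [effort, year_built]
    for year in range(1, years + 1):
        cap = start_fte * (1 + growth) ** (year - 1)
        maint = maint_rate * sum(effort for effort, _ in code)
        redev = 0.0
        for chunk in code:
            if year - chunk[1] >= replace_after:
                redev += chunk[0]   # rebuild this component...
                chunk[1] = year     # ...which resets its age
        new = max(0.0, cap - maint - redev)
        code.append([new, year])
        capacity.append(cap)
        new_dev.append(new)
    return capacity, new_dev
```

Running it for 10 years reproduces the headline figures: roughly half the capacity goes to new development around year 5, and the team passes 10 FTE in year 10 (3 × 1.15⁹ ≈ 10.6).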

Hardware costs

Modern cloud infrastructure uses physical servers whose available capacity is clustered. Virtual machines (VMs) can be installed on those clustered servers, which behave like physical servers. For example, dozens of VMs can run on just a few physical servers. This makes hardware setup and management flexible and allows specific tasks to be assigned to different servers.

Whether you work with a private cloud in your provider’s data centre or in the public cloud, for example at Microsoft or Amazon, you work with virtual servers and the costs incurred for them are charged to you. These external costs generally compare favourably with running in-house hardware and deploying in-house administrators.

Once a virtual environment has been set up, you have access to all kinds of services that can be deployed flexibly: processors with your choice of cores, RAM, networks and firewalls. Everything is charged for separately, including operating-system licences, licences for standard software, data transport and the not inconsiderable energy used by the hardware. The advantage of the public cloud is easy access to a world of databases and analysis tools, visualisation software and dashboarding. These are all essential components for the data scientist.

Data science requires access to lots of data

There is another reason why external deployment and especially public cloud providers are interesting for application of ICT in water management, and that has to do with data storage. In our water domain, the optimal deployment of water management infrastructure depends to a large extent on data about the behaviour of the water system. Think of data from water level measurements, discharge measurements, rain gauges, rain radar, satellite images and weather models.

From measurement to weather model, the sources mentioned generate increasing amounts of data, from many gigabytes to terabytes per year. This in itself should not be a problem, were it not for the fact that for analysis purposes, we actually need historical series. In operational water management, it is important to see how a current or imminent situation in the water system relates to what we know from the past.

The data collected annually must therefore be preserved. Moreover, more and more public data sources are becoming available, and their resolution in time and space keeps increasing. In practice, it is also very difficult to part with data once it has been made available, because there is always some process in an organisation that uses it. In short, the amount of data to be stored can easily grow exponentially, requiring ever more storage capacity.

Which way is the cost going?

Obviously, investing in software involves many additional aspects that need to be considered. Hardware costs to run the whole thing are an important part of that, and moreover, those costs only go up over time. We have estimated the cost of hardware for an organisation getting serious about ICT for operational and strategic water management.

Fig. 3 shows the overall picture that fits the previously outlined organisation that starts with three software developers, grows the team 15% annually and replaces all software components with new ones after 7 years. In the calculation, we have assumed for simplicity that one FTE costs 100 000 euro in employer expenses. All data processed by the software is stored and backed up. After 10 years, the costs have more than quadrupled, from originally 0.3 to 1.3 million euros.

Fig. 3. Evolution of total DevOps and hardware costs, where a 3 FTE development team grows by 15% annually, all software components are redeveloped after 7 years, and all collected data are preserved.
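The staff side of that estimate is easy to check (using, as in the article, 100 000 euro per FTE per year; the remaining share of the 1.3 million in Fig. 3 is HydroLogic’s own estimate of hardware and data-storage costs):

```python
FTE_COST = 100_000            # euro per FTE per year, as assumed above
start_fte, growth = 3.0, 0.15

year1_staff = start_fte * FTE_COST                       # 300 000 euro
year10_staff = start_fte * (1 + growth) ** 9 * FTE_COST  # ~1.06 million euro
```

Team growth alone already takes the annual staff bill from 0.3 to roughly 1.06 million euro; hardware and data storage push the total past 1.3 million.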

Conclusions

ICT provides great opportunities to make water management much smarter and better prepared for the effects of climate change. Decisions can be made deliberately on the basis of historical, current and expected information and that is a huge gain. On the other hand, if an organisation wants to secure this professionally, it requires great effort and significant costs.

Within modern water management, many digital developments are taking place, and data science is becoming an essential link in the chain. What I hope to achieve with this article is that organisations also give serious thought to the future, because a good start alone is not enough. A long series of challenges lies ahead to get and keep everything right in the ICT field, and on top of that, the costs involved are substantial and increase every year.

Cooperation between organisations can bring relief for the problems outlined: costs can be shared and some software need not be developed twice. As far as I am concerned, this goes beyond cooperation between governments. In the development of HydroNET, we set up the cooperation broadly and involved end users, other companies and also institutes. As a result, a lot of knowledge was and is shared, and extremely high costs per organisation are avoided.

Cooperation does not take away from the fact that the ‘total cost of ownership’ of software for water management is high, especially for its maintenance and operational use. Those costs add up over time to as much as three times the cost of developing new functionality! Of everything you think you will invest, 2/3 disappears into ‘invisible’ redevelopment, management, maintenance and hardware costs.

Therefore, my advice is: seek cooperation in the golden triangle of ICT companies, governments and research institutions. The credo should be: ‘share and reuse existing and proven software as much as possible’, especially where generic processes such as data collection, processing, storage and making data and information accessible are concerned. The already large budgets required for ICT can thus be used much more effectively and the focus can remain on the much-needed innovation in water management.

Investing in software development? Look before you leap!

About Arnold Lobbrecht

Arnold Lobbrecht is founder and one of the owners of HydroLogic. He holds a PhD in Control of Water Systems from TU Delft. In his 35-year career, he has been continuously active at the intersection of water management and ICT. Besides various positions in industry, he worked part-time for 15 years at UNESCO-IHE, the international water institute in Delft, Department of Hydroinformatics, where he headed the Operational Water Management / Real-Time Control (RTC) research group as Associate Professor. During his career, Arnold initiated several public-private initiatives and led teams of professionals in the Water and ICT domain.
Apart from this work, Arnold has been and is active in various working groups of NLdigital, the Dutch AI coalition, the Royal Institute of Engineers, the Royal Dutch Water Network and NLingenieurs. His research interests are in data analysis, hybrid modelling with physics-based and data-driven models (machine learning), generative AI and Artificial Intelligence in general.