Episode #191 of The Water Values Podcast by Bluefield Research is all about KETOS, and KETOS CEO Meena Sankaran stops by to chat with host Dave McGimpsey to discuss corralling water data and the digital transformation blooming across the water industry.
Emerging Technologies in the Water Industry
This podcast is all about corralling water data. As a digital water entrepreneur, Meena shares her perspectives on the emerging technologies disrupting the water industry and specific gaps in cybersecurity.
Meena and Dave also discuss Meena’s childhood in India and how her personal history with water impacts her current work in water quality in an industry historically slow to change.
You can listen to the podcast here or read an excerpt of their conversation below.
The following is a transcript of Meena’s discussion with Dave. Note that the conversation has been edited and in some cases shortened for flow and clarity.
Meena, welcome to The Water Values Podcast. So great to finally have you on. How are you today?
I’m doing great, David, thank you so much for the opportunity. I’m excited to be here, and I’m ready to have a fun conversation over the next few minutes.
I really appreciate you taking the time out. Because you are a new mother, you have a lot of things going on in your life right now, so I really appreciate it.
It’s great to share that experience with so many other women in water – whether it’s running companies, being a director of utilities, being a water operator. This is a shout-out for every single one of those moms who are juggling all kinds of things.
I would love to learn more about your background and how you came to the water sector.
It’s been a journey. I grew up in India and came from quite a modest family in terms of the water access we had. It’s amazing how privileged we are in this country with regard to the comforts we have around water. You open a tap, and you have hot water, open another tap, you have cold water, and you take it for granted.
It’s a separate podcast if we want to discuss how water should be priced in this country. But, in my childhood, we had access to an hour of water that would show up in a tanker, and you had to purchase it. And it was not necessarily clean drinking water. If you could afford a filtration system, it was better. We were used to the traditional way of boiling water. That’s probably what most of Texas was doing last week. You would boil it three times for drinking and four times for cooking or vice versa. But that was the reality of water. I probably had many waterborne illnesses before I was even 15. And that was just the norm because you were just exposed to so much.
I’m an engineer by education, and I’ve spent about 17 years of my career in data center infrastructure, looking at how you develop actionable insights out of the massive amounts of big data you collect, whether you’re working on a trading floor or at a large tech enterprise. Fundamentally, what I realized is that this movement just didn’t happen fast enough in the water sector – or wasn’t even happening in conversation seven or eight years ago. At some point, it dawned on me: why can’t we leverage technology to move the needle for these legacy sectors, especially for a precious resource like water? That was an inflection point.
If you want to be an entrepreneur, you better have a very serious drive. Because otherwise, it’s a very challenging journey with many ebbs and flows. There was a culmination of many things, and I decided to found KETOS, which was an intersection of IoT, data science, and robotics for the world of water. I wanted to answer the question: how do we really create value for the sector, which is much needed and long overdue?
I cannot wait to get to the data conversation with you. First, I do have a question. There are very few guests that have had the experiences you’ve had growing up in India, and I’m just curious, have you ever sat back and reflected on how that upbringing has impacted your life?
We could go on for an hour. I attribute who I am today to my parents – to my mom and dad. The values, the principles, all of those were built up as part of that upbringing. And you never complained about what you got. You made the best of it. That resilience translated into that much-needed ingredient when becoming an entrepreneur. That resilience, that attitude of making anything possible with minimal resources, contributed to the founding of KETOS.
If you look at the youth today, they have a fantastic set of ideas, and they’re more upset about climate change and are looking at what we’re doing for water and future generations. You realize very quickly that it’s essential to not just build something that’s commercially viable but to really bring impact. While you need commercial viability, you have to have a social impact side. Then, you can become an evangelist towards that cause and truly bring that entire lifecycle together. And that’s something I feel privileged to be part of, and hopefully continue paying it forward to ensure building a business isn’t just about taking an idea and translating it into revenue.
Let’s hop into the data side of things. I am curious about what your thoughts are, as we sit here in 2021. Where is the utility in all of this? Where are we in terms of using and harnessing data in the water sector?
It’s a fascinating question because it’s so different, even within the water sector, from segment to segment. The water sector is very, very siloed. If you just look at municipalities, you have different sizes of utilities, for example. That dictates the types of resources. And within the next ten years, a majority of the utility workforce will retire. How are you pursuing the digital transformation now that you’ll need over the next decade?
Utilities have collected a wealth of data over the decades. It’s not for lack of data; turning it into information is the hard part. Many don’t have a sense of how to use the data to enable and empower day-to-day operations. The data can drive specific decision changes, yet those changes are not as actionable, predictive, or insightful as you and I would hope for, and many of our utilities need a ton of help and support to parse the data.
The approach to really manage that data also has to be affordable. You cannot be a large engineering firm that goes in with a very high price tag and asks a utility to bridge that gap. We have to figure out more affordable options to really create that interoperability. For example, when I say interoperability: you’ve got smaller utilities with data collected on clipboards and in PDFs that has been translated into spreadsheets, some data that may have gone into SCADA systems, and some data that is probably still manually entered and manually recorded. Some data might have come from a few new handheld probes implemented in the last five or seven years. Some might be a single or dual parameter from an online analyzer deployed in the last few years. Purchases were made at different times, and technologies were implemented on different time scales. And now you have some time series data, some static data, et cetera. The intelligence of bringing all of that data together and harnessing it in a very integrated way is key. However, there is no cohesiveness within or across utilities.
Data management is the real key. You put your finger on it: there’s a lot of data out there, but organizations don’t know what to do with it, or it’s just improperly organized. So, how can we be better at managing our data? What are some tricks?
Well, the first question is: how much of that data is useful to you? Not all data might be helpful, depending on how you’re parsing it. The first step is digitizing everything – just get everything digitized into a massive data lake. That way, you have a common baseline to start. The second step, after you have that digitized data, is to understand the time series part of it – for example, what’s the frequency of the testing you’ve collected? Some data is real-time, and some is measured once a month, once a year, et cetera.
Collect that data, and then look at how you’re thinking about it. There’s data that looks at operational efficiency versus data that looks at resource efficiency versus data that looks at asset management. Really separating the data based on the value it’s supposed to create, rather than keeping it all in a common store where you’re unable to pick and choose the exact benefits the data translates to, would be your second tier of organization.
The third step is taking that and building a picture of what additional sensors you now need that are very targeted and specific to generating net new data that you don’t have. Identify the gap. For example, if I’m going to hone in on water quality monitoring, and I parse through the data I’ve analyzed and gone through that third tier of data organization, I’d understand that I only have pH, temperature, and maybe EC, and what I don’t have is a true understanding of iron or heavy metals to help with the composition of our chemical feed or chemical treatment. I’ve honed in on the gap and can install the right system to generate net new data.
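As a loose illustration (not KETOS’s actual pipeline; the records, frequency labels, and required-parameter list below are hypothetical), the three steps Meena describes – pool the digitized data, organize it by measurement frequency, then identify parameter gaps – could be sketched as:

```python
# Hypothetical sketch of the three-step data organization described above:
# 1) pool digitized records, 2) separate by measurement frequency,
# 3) flag parameter gaps against what a monitoring program needs.
from collections import defaultdict

# Step 1: all digitized records land in one common "data lake" (a list here).
data_lake = [
    {"parameter": "pH", "value": 7.2, "frequency": "real-time"},
    {"parameter": "temperature", "value": 18.5, "frequency": "real-time"},
    {"parameter": "EC", "value": 430.0, "frequency": "monthly"},
]

# Step 2: organize records by how often they are measured.
by_frequency = defaultdict(list)
for record in data_lake:
    by_frequency[record["frequency"]].append(record)

# Step 3: compare measured parameters against a target list to find gaps.
required = {"pH", "temperature", "EC", "iron", "chromium", "manganese"}
measured = {record["parameter"] for record in data_lake}
gaps = sorted(required - measured)

# Parameters with no data are the candidates for new sensor deployments.
print(gaps)  # → ['chromium', 'iron', 'manganese']
```

In this toy example, the gap analysis surfaces iron, chromium, and manganese as the parameters with no data – exactly the kind of net-new-data targets Meena describes.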
By the way, if you know the specific parameters that are impacting or affecting the corrosion of your pipes, your maintenance can be better, and that reflects directly on your manufacturing costs and savings. That’s where the real value comes in. And as a plant operator or plant manager, you can show a substantial amount of savings via better monitoring – and also value-added efficiency.
What are some of the most common data gaps that you see?
We see data collected from many different sources, and that data is very disparate. I’ve seen that as a gap in a lot of environments.
I’ve also seen utilities and industrial customers that have some sort of SCADA system and are trying to build their own homegrown platforms. That takes a lot of resources and budget, and they may not have that kind of capital to build an entirely new in-house platform. If you do that, do you have all of the ingredients for what you need to be compliant? Are you investing in a whole in-house lab? And is that needed? Can you enhance your in-house lab to be faster, cheaper, and more efficient by managing certain automated instruments and automated systems that now exist?
The other part of data is looking at it from the outside in. How has data management worked, for example, in the transportation sector? How has it worked successfully in energy? These industries are already successful in terms of how they have driven grid management and how they’ve used their data. If you look at it from that angle, you realize the business model also needs to be disruptive for data to be widely addressed in a more succinct manner.
Are there things that we’re really good at? Do utilities have some little gold nuggets that they don’t even realize they have?
Absolutely! Look at pumps. Traditionally, utilities have laid out piping infrastructure, and they’ve got lots of pumps and metering for water quantity. What the sector has done really well is collect a lot of quantitative data, and we’re trying to correlate information based on that, but it seems to have missed a substantial gap in the past with qualitative data.
We’ve touched enough on the data collection and data management layers; from a sensor detection standpoint, I would say quality is where the biggest gap is. A substantial amount of equipment installation is already done out there. When you look at the entire stack, it’s becoming more predictive. Now that you’ve collected, say, ten years of data, how are you using simple machine-learning models to allow a pattern to emerge? You can look at that pattern and ask: how can I be more predictive? For example, what’s my maintenance like? Maintenance is a low-hanging fruit to start with. While people are risk averse, there is still a lot of low-hanging fruit to try while using the opportunity to research how to handle more capital-intensive projects.
The majority of utility employees are going to be retiring over the next few years. How do we get those employees to buy into data collection? Do you see much resistance out there from folks that are worried about being replaced by machines? How can utilities get their workforce to buy into the use of more enhanced data?
It’s a very important question. I would answer it in a few different ways. One would be: every single person, whether it’s a utility operator or somebody who’s out there driving for three hours to grab a sample, truly cares about water, and they care about what they’re doing, which is why they’ve spent their entire lives in a single sector. This is very uncommon in the tech sector; you can’t say the same of other areas. They all care. The question is, how do you help them channel that energy and make them realize that what you’re trying to accomplish is not about replacing them but about enhancing their value?
They have to deliver safe water to constituents. But is that sustainable water? They’re building everything in a very static mode. One of the conversations I’ve typically had with folks who are retiring, or who have exhibited a bit of resistance, is this: if they care about the utility and want to give safe water to constituents but are retiring in five years, how are they passing all of that knowledge of 30 or 40 years, that extensive amount of experience, to the next generation?
If we don’t implement something like machine learning that can actually take the value of every aspect of what they’re implementing within controls, those controls will never be understood. They know every aspect of every anomaly that they have seen, and it’s all in their head. And so if we can capture that through a model and be able to input that data into a system, and consistently have them as that water expert, there’s value in that. As technologists, we need to create business models that are not necessarily replacing operators. That’s not the process or the outcome. You’re really there to let them be the water expert and have us be the technology.
For example, in the KETOS model, we install this system, we maintain it, and we manage it. So there’s no requirement for calibration and cleaning and labor-intensive processes (that operators get pulled into). There’s also no need for sitting and trying to get ramped up on net new technologies, which might be too much for many folks that don’t come with an IT background. They’re not data analysts. We can’t expect them to convert overnight. Instead, they become mentors. They become the training minds for all these machine learning models that are going to make that utility a sustainable utility.
That’s a great point. One of the other folks I’ve talked to also made the point that machine learning and artificial intelligence really just allow humans to focus on the things that humans are really good at, and essentially elevate the stuff that they’re doing to the highest value. You’ve talked a little about the KETOS solution so far. Could you just tell me a little more about it?
We deliver a real-time water intelligence solution so that people can really feel empowered to take action based on data-driven insights. As part of that process, we’ve realized that we cannot give folks a siloed solution. We have to give them a comprehensive offering.
With that intent in mind, the solution is designed as a vertically integrated stack. So you can have the hardware and all of the connectivity from the hardware that’s detecting the data and transmitting it into the cloud. All of the communication, the transmission, is fully secured and encrypted. All the back-end processing analysis you need to do for the data is done in the cloud. We have a robust platform that we’ve built that takes all that data and delivers it to your mobile phone or a web browser, depending on your use. The information is in ppb, parts per billion. We can say, “here’s a two parts per billion level of chromium-6 detected in your water at this instant in time.”
One of the most unique things is that the equipment is very autonomous and can be remote-controlled. It’s all bi-directional communication. For a water operator, let’s say there’s a COVID outbreak, and their plant has issues. He does not need to show up. He can be in his house, use his mobile phone, and activate all of the testing that he needs for the different water sources remotely while looking at how his water is performing in terms of iron, chromium, and manganese, or whether selenium has any anomalies relative to targets. That’s the value that we can provide for this industry. Once customers use that data, we can then derive a lot of the predictions and the intelligence from there. But it all starts from that single data point that didn’t exist before. They also gain the ability to schedule testing, manage the frequency of testing, manage the variance of the parameters they want to test, and manage the number of users they want to bring on board.
For us, it’s really about shifting a mindset. And for these customers, testing does not have to be cost-prohibitive. The testing frequency does not have to rack up dollars because they’re not purchasing consumables. If we can truly make it more about data as a service, and remove the element of all these siloed purchases and capital-intensive projects, then we really have something here for them to adopt at scale and transform how they operate.
Is there a difference between how data is used inside the fence versus outside the fence?
Inside the fence, at treatment plants, people are very well equipped relative to other areas. That’s also the first point of entry where we’re seeing a lot of interest, because there are immediate gaps.
And there’s a big void, too, by size. For example, larger utilities tend to have an in-house lab; they already have resources, and they’ve already invested in a whole bunch of equipment. They’ve got treatment plants and have already built processes around them. They have permits in place and their compliance with a whole variety of things that they need to be measured on. Still, there’s a lot to improve, but at least a system is in place. They’re not starting from scratch.
On the contrary, when you start going outside the fence – when you look at distribution, at how many monitoring stations there are and in which locations, at what types of customer complaints have typically come forward, or at how to measure groundwater monitoring versus surface water monitoring – that’s where it gets tricky. The areas where utilities or industrial operators could use a lot of help depend on where their source water is coming from. If you’re thinking about bottling plants, depending on whether you’re purchasing the water from the city or getting it from a groundwater source, different testing is needed, and there are still a lot of gaps in which KETOS can be very valuable.
One thing I wanted to ask about, especially in light of what happened in Florida recently, with all the kinds of cloud-based technology and data, is what are you seeing in terms of cybersecurity risk? What do you say when clients ask you about cybersecurity?
This is not the first time someone’s thought about cybersecurity. You’ve got thousands of banking applications, FinTech applications, and an entire world of enterprise applications that have moved over to the cloud or have been actively managed on the cloud for a decade or two. But the water sector is just now experiencing this shift. Organizations are dipping their toes in and are a bit more risk averse – rightfully so when it comes to drinking water and public health.
That said, let’s make sure that we don’t have to deal with human errors created by password issues. Let’s have some good, healthy security practices within our utility in terms of password control, role-based access, and how those permissions are tied to users. Those are the basics of security that every utility or industrial customer should be looking at.
The second aspect is: if you have a variety of instruments sending data to your cloud, how is that data transmitted? You can sniff some packets and check how that data is being transmitted. Is it encrypted? How are customers engaging with that data? Once that transmission is done and your next layer is within the cloud, do you have redundancy? Is there a disaster recovery plan that this technology provider is giving you? Are there multiple cloud providers that you use?
For example, AWS is one of the most commonly used clouds. You also have Microsoft Azure, which is very widely used. Do you have a backup? Do you have that data dispersed between the two so that there are always layers of resiliency and redundancy in how your data is managed and stored? It’s not for the utility to build it; it’s for the utility to work with a technology company and ask them, “where is your data hosted? What’s your cybersecurity plan? What’s your resiliency plan? What’s your disaster recovery plan? How are you looking at your business resumption plan? How are you looking at any intermittent issues? What happens in the scenario where I lose the data? Do you store the data?” These are the questions that need to get asked. These are the questions that every operator looking at digital transformation needs to be very, very smart about, in terms of helping design their architectures so that they’re really future-proofing them and, to a certain extent, building that backup from a primary and a secondary standpoint.
Lastly, with respect to the Florida plant as well: how much automation is good, and how ready are you for it? How much of that automation is valid or viable for your own industry or your own clients? Do you want a fully closed-loop automated control? Or do you want to monitor and still have an analyst actually look at that data? Do you want a hybrid model? How ready are you for either? All of these questions need to be asked and discussed.
This has been tremendous. I have learned a great deal from you. And thank you for your time. Given your status as a new mom, if you have one last message, what is it?
I would really encourage everyone to realize that what we’re undergoing right now, as part of this industrial core data revolution, is a call for water data. Remember, these are not new inventions. Don’t be afraid. This is innovation that has been deployed and practiced in other sectors. Security issues and other factors have been addressed in other sectors. Problems that have been resolved elsewhere are now being solved in the water sector. So challenge the status quo. Be curious and ask more questions as you’re designing the new architecture. There are solutions and technologies out there you can use. And this is the right time to act because it will only help you be great at what you do and deliver a much higher value of service to your constituents – and, ultimately, probably generate more revenue. It can help industrial or utility companies be more profitable and lucrative, especially post-COVID and in this market climate.
About The Water Values Podcast: Water Industry News and Insights
The Water Values Podcast is an ongoing series in collaboration with Bluefield Research.
The podcast explores diverse perspectives in the water industry to uncover the actual value of water. Hosted by Dave McGimpsey, The Water Values Podcast uses each episode to dissect one specific aspect of water at a time. Past episodes have taken deep dives into water treatment approaches, explored water reuse, examined water resources, and more.
The Water Values Podcast is a fantastic resource if you want to learn more about water and the water industry.
Find more episodes of The Water Values Podcast here.