
How Google Tests Its Products

IoT For All

- Last Updated: May 18, 2023



https://youtu.be/wAuiJ7chqQk

When building IoT products, product testing is critical to success. Clare Meredith and Kira Dickson of Google join Ryan Chacon on the IoT For All Podcast to discuss product testing. They cover how to approach product testing, what can make product testing unsuccessful, data validation testing, network and security testing, developing a product testing strategy, and the challenges of deploying solutions for large real estate portfolios.

Episode 289's Sponsor: Avnet Silica

The We Talk IoT Business Podcast is back! Explore best practices, IoT use cases, and formulas for success on your preferred streaming provider. Or visit avnet-silica.com/podcast.

About Clare

Clare Meredith is a Program Manager with the Enablement and Governance organization at Google. She promotes good stewardship in technology, primarily in the areas of network and security compliance. More recently, she's entered the world of Data Validation testing with Kira Dickson and aims to build a strong program around ensuring teams selecting technology solutions get the most value out of their investments.

Interested in connecting with Clare? Reach out on LinkedIn!

About Kira

Kira Dickson is a Technology Program Manager within Google's Real Estate and Workplace Services organization (REWS). She is responsible for evaluating and delivering technologies to enable REWS objectives, with a specific focus on Workplace Utilization. Prior to this, Kira was the Intelligent Environments Lead at the Google R+D Lab for the Built Environment, and an early employee at Comfy-Siemens (previously Building Robotics). Kira is passionate about leveraging data and innovation to transform our built environment.

Interested in connecting with Kira? Reach out on LinkedIn!

About Google

Google is part of the Alphabet group of technology companies, which spans some of the most exciting, innovative tech out there. Its primary business is online advertising and search engine technology. It also has groups focusing on cloud computing, e-commerce, AI, and consumer electronics such as the Pixel smartphone.

Key Questions and Topics from this Episode:

(01:20) Introduction to Clare and Kira

(03:05) How do you test a product?

(06:19) What can make product testing unsuccessful?

(08:37) What is data validation testing?

(11:56) Network and security testing

(15:18) Developing a product testing strategy

(18:23) Challenges of deployment for large real estate portfolios

(21:39) Learn more and follow up


Transcript:

- [Ryan] Hello everyone and welcome to another episode of the IoT For All Podcast. I'm Ryan Chacon, and on today's episode, we're going to talk about product testing for success. We're going to talk about where teams go wrong with it, data validation testing, network and security testing, and how to develop a strategy. My two guests are from Google: Clare Meredith, Program Manager at Google, and Kira Dickson, Technology Program Manager at Google. You all should know who Google is by now, but if you don't, they're part of the Alphabet group of technology companies, which spans some of the most exciting and innovative tech out there. Really cool conversation I think you'll really enjoy. We'd truly appreciate it if you'd give this video a thumbs up, subscribe to the channel, and hit that bell icon so you get the latest episodes as soon as they are out. But before we get into it, real quick, we have a word from our sponsor.

The We Talk IoT Business Podcast is back. Explore best practices, IoT use cases, and formulas for success on your preferred streaming provider. Or visit avnet-silica.com/podcast. That's the We Talk IoT Internet of Things Business Podcast. If you want to check it out on the website, it's www dot avnet a v n e t dash silica, s i l i c a dot com slash podcast.

Welcome Clare and Kira to the IoT for All Podcast. Thanks for being here this week.

- [Kira] Thanks for having us.

- [Clare] Yeah. Thank you.

- [Ryan] Absolutely. All right. Excited for this conversation, but before we get into things, let's go ahead and have you both do a quick introduction about yourselves and talk about what you do over at Google; those kinds of things I think will be great for giving some context to our audience. And Clare, if you wanna kick it off.

- [Clare] So I'm Clare. I work for Google and have done for a year and a half now. I'm a program manager in the digital buildings program. My responsibility is primarily around a program we call device qualification, which evaluates IP-capable devices, whether that's IoT devices, HVAC, lighting, any of those that would be deployed in a building, for their network and security capabilities.

So that would be the vast majority of my work.

- [Kira] And I'm Kira Dickson. I'm a technology program manager within the same team as Clare, the tech and data team in the real estate organization. I've been with the tech and data team for about a year and a half now, and previously I was on the Google research and development team for the built environment.

On the team, I'm specifically responsible for smart building technologies, making sure that we're aligned to business needs and that we go through the process of working with our partners across the business to actionize the technology needs that they have.

I previously came from Siemens, working on a workplace experience application for the built environment called Comfy.

- [Ryan] Fantastic. Thank you both for that. So let's go ahead and jump right into the topics we want to talk about today. A lot of the initial conversation is gonna be around product testing and how to product test for success, things along those lines. I'd love it if you could break down or talk a little bit more about how companies should be thinking about approaching product testing to be successful. I know there's value-driven testing as an approach, but maybe just high level it and start the conversation there around how to test a product for success.

- [Clare] Yeah, I can kick that off. I think it's probably important just to differentiate that when we talk about product testing, we are talking about from a deployment, from a customer standpoint as opposed to a development standpoint. And I think really firstly, it's about knowing the problem you want to solve.

What are the business objectives? It's not about deploying a piece of technology because it's cool or just for the sake of it, which happens more than you realize. And then it's knowing what you want the future to look like, knowing what you want day two to be with this technology in place, and targeting that as the success.

I think sometimes it's difficult to differentiate between a deliverable, like deploying the technology, and the actual benefits realization portion of the project or the program. So that would be where I would say you should start for product testing.

- [Ryan] Kira, anything to add there?

- [Kira] The only thing that I would add is the steps that we think about when we're actionizing against these business needs: again, as Clare was mentioning, starting with that business objective, then narrowing down into the use cases that we have from a variety of different stakeholders.

Then tackling who our users are across the business, because for a single technology solution we could have a variety of users using that solution, and we need to be clear on all of the different requests coming from those different parties. Then developing user journeys.

So being clear, not Clare, about what those users are looking to do with the data to make it actionable, and understanding those business requirements. Business requirements help us develop functional requirements for the technology, which allows us to then get into some details around how we would actually test technology solutions to actionize against that need.

Functional requirements help Clare and me move into technology selection and actually identifying who the partners are that we need to work with and what the specifications are for that technology that we need to have. We'll get into data validation shortly, but that's a subsequent process that happens afterwards to make sure that we deliver value to the business and that we're checking off what those use cases are.

- [Ryan] Fantastic. Yeah. The process you outlined there from a business objective, going through all the way to when you start to be able to test seems pretty thorough, and I imagine requires a pretty good plan of attack in order to do all that well, a clear understanding of who you're building for, what you're building, what you need and so forth.

But I imagine throughout that process, or even maybe absent of that process, there are companies who will try to go down the product testing route and not do it successfully. So from your experience, where along the product testing journey do teams go wrong and what advice do you have for them on maybe how to avoid either common pitfalls or things that could come up that are good to know early on?

- [Clare] Yeah, I can jump in there, because I want to reiterate something Kira just said. In my experience, one of the main ways teams go wrong is that they might know their use case, but they don't necessarily know their user journey well enough, or they may not realize there's more than one user journey.

So it's about really nailing that down, getting all your stakeholders involved, and making sure you look for help, knowing that you're not gonna have all the answers. I think getting as many people around the table in that planning phase will go a long way to removing some of the barriers that you may encounter further down.

- [Kira] Maybe just to add on to that, I think the intersection between business requirements and functional requirements is often where things go wrong. Business requirements are high level: what do we need to do with this data, with this information?

But then the technology requirements often don't line up: the functional requirements for that system aren't validated or checked to make sure they're in alignment with those business requirements. So focusing on what those core requirements from our users are, and then how we go about testing or validating a solution to make sure it can deliver what we expect it to, really helps alleviate any potential gaps in the delivery process of the solutions that we're looking into.
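To illustrate the idea, here is a minimal sketch of tracing business requirements to functional requirements so that unvalidated gaps are visible before technology selection. The requirement IDs, fields, and example data are hypothetical, invented for this sketch rather than drawn from Google's process.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionalRequirement:
    req_id: str
    description: str
    validated: bool = False  # flipped to True once a test confirms the system meets it

@dataclass
class BusinessRequirement:
    req_id: str
    description: str
    functional_reqs: list = field(default_factory=list)

def find_gaps(business_reqs):
    """Return IDs of business requirements with no validated functional requirement."""
    return [
        br.req_id
        for br in business_reqs
        if not any(fr.validated for fr in br.functional_reqs)
    ]

# Hypothetical example: one business requirement traced to one
# as-yet-unvalidated functional requirement.
occupancy = BusinessRequirement(
    "BR-1",
    "Space planners can see per-floor occupancy",
    [FunctionalRequirement("FR-1", "Sensor reports headcount at least every 5 minutes")],
)
print(find_gaps([occupancy]))  # ['BR-1'] until FR-1 passes validation
```

Even a simple map like this makes the business-to-functional handoff Kira describes auditable: a solution isn't "done" until every business requirement traces to at least one validated functional requirement.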

- [Ryan] Gotcha. Yeah, I was gonna ask you about when you go from the business requirements to the technical requirements; I've talked to many other guests who have brought up similar points about how sometimes there's a disconnect there. I was just curious how you all approach it, but you answered that question already, so that's perfect.

One thing you mentioned that I wanted to dive in on is data validation testing. Can you explain to our audience what exactly that means, where it comes in, how it's done, the value it provides, and why it's important?

- [Kira] So I think of data validation as a necessary step in that overall solutioning process. As I was mentioning, if we front load with understanding the needs of the business and the needs of the users, when we then get into technology selection, we need to abide by some of our core principles, whether that be things like qualifications, so working with Clare, or working with our privacy and legal teams, security teams, or our building operating system team.

So there's a bunch of different partners that we need to work with towards that technology selection, which is basically asking: is that technology going to meet our overall requirements? Once we get past that point, we need to make sure that the technology we've selected is going to align to those business requirements.

And so I think the opportunity with data validation testing is to make sure those functional requirements of the system align with those business requirements, and to create really thoughtful test plans for how to go about testing that technology in a real environment, making sure the data's usable, actionable, and accurate, so that when we deploy this technology at whatever scale it needs to be deployed at, we have trust in that data and it can be used towards that end objective.

And so it's a pretty key process for us, especially when we think about that application layer, which is: what do we want to do with the data? The more that we can front load the testing of that, the better output we're going to have, both in terms of the data that comes out of the technology we're looking at and its usability to potential applications.
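To make that concrete, here is a minimal, hypothetical sketch of the kinds of checks a data validation test plan might automate: completeness, plausibility, and freshness of a sensor feed. The feed shape, field names, and thresholds are assumptions made up for the example, not a description of Google's tooling.

```python
from datetime import datetime, timedelta, timezone

def validate_feed(readings, expected_interval=timedelta(minutes=5), value_range=(0, 500)):
    """Run basic completeness, plausibility, and freshness checks on a sensor feed.

    `readings` is a time-sorted list of (timestamp, value) tuples, where the
    timestamps are timezone-aware datetimes. Returns a list of human-readable
    issues; an empty list means the feed passed.
    """
    if not readings:
        return ["no data received"]
    issues = []
    timestamps = [t for t, _ in readings]
    values = [v for _, v in readings]

    # Completeness: flag gaps larger than twice the expected reporting interval.
    for earlier, later in zip(timestamps, timestamps[1:]):
        if later - earlier > 2 * expected_interval:
            issues.append(f"gap between {earlier} and {later}")

    # Plausibility: every value must fall inside the physically possible range.
    low, high = value_range
    issues += [f"out-of-range value: {v}" for v in values if not low <= v <= high]

    # Freshness: the newest reading should be recent enough to act on.
    if datetime.now(timezone.utc) - timestamps[-1] > 2 * expected_interval:
        issues.append("feed is stale")

    return issues
```

Because the checks are just a function over a feed, they can be run once at the onset and then rerun on a schedule, which matches the point Kira makes next about validation not being a single moment in time.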

- [Ryan] Yeah, that's very well said. We've heard similar sentiments from other companies we've spoken to. IoT is great because it's allowing companies to access new data that they may not have had access to before. But in order for companies to adopt a lot of these solutions, it's important for them to understand what data can be collected, whether that data is relevant to what they're trying to do, and how it can then be organized for somebody within the organization, the end user, to utilize it and get value out of it to make decisions. Otherwise, just collecting data for the sake of collecting data is not gonna provide them as much value, so that data validation step seems to be critically important for a project to have a real chance of success.

- [Kira] And I think data validation testing doesn't need to be just a single point in time. It's important at the onset of the project that we go through it so we can have confidence from the start. But when we talk about maintenance of solutions over time, that data validation also becomes critically important.

And so we need to make sure that we have moments of checking in to make sure that what we've tested for continues to be consistent as solutions are deployed, as applications are maintained.

- [Ryan] One thing I wanted to ask you about, and maybe Clare, this is a perfect one for you to take: the data, as we've talked about, is obviously super important, and so is how you validate it through the testing phase. But what happens when it comes to actually deploying these solutions and this technology into existing infrastructure, or when there are existing policies that need to be considered?

So more of that network and security testing element, which I know is another critical piece of doing testing the right way. How do you think about that? How should other people be thinking about that? Why is that so important? I'd love it if you could expand on that a bit.

- [Clare] One of the things I think about with Kira's work and my work is that we're really testing the assumption that something will just work. And my whole program came about from that assumption being proven incorrect: there was a bunch of technology deployed on an existing network, it didn't work, and then the qualification program was born.

So, I go back to my point about knowing your user journeys, knowing that there are multiple user journeys, getting all of your stakeholders around the table, and making sure you're asking all the right questions, or somebody is. Network and security is this invisible thing that people might know exists, but they don't understand it, so they just hope it works or the problems go away. So really it's about making sure you do understand the policies that exist, you do understand what systems may need to be integrated, and getting the right people around the table.
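As an editorial illustration of the idea, here is a toy sketch of what checking a device against a network and security policy could look like. The policy values and device attributes here are invented for the example; they are not Google's actual qualification criteria.

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    name: str
    open_ports: set
    supports_tls: bool
    has_signed_firmware_updates: bool

# Invented policy for the example: the allowed ports and required capabilities.
ALLOWED_PORTS = {443, 8883}  # HTTPS and MQTT over TLS

def qualify(device: DeviceProfile) -> list:
    """Return policy violations; an empty list means the device qualifies."""
    violations = []
    unexpected = device.open_ports - ALLOWED_PORTS
    if unexpected:
        violations.append(f"unexpected open ports: {sorted(unexpected)}")
    if not device.supports_tls:
        violations.append("no TLS support")
    if not device.has_signed_firmware_updates:
        violations.append("firmware updates are not signed")
    return violations

sensor = DeviceProfile("occupancy-sensor", {443, 23}, True, False)
print(qualify(sensor))  # ['unexpected open ports: [23]', 'firmware updates are not signed']
```

The point is simply that the "invisible" policies Clare mentions can be written down as explicit, testable criteria before a device ever reaches the building network.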

- [Ryan] Yeah, absolutely. That's something that seems to be a big challenge for a lot of companies, especially bigger companies we've spoken to in the smart building space and the industrial space, when it comes to existing infrastructure: not necessarily a barrier per se, but something that's a very large consideration that needs to be thought through.

And sometimes it's not thought through correctly, or not seen as an important task. But understanding what you're going into and how the technologies you're bringing into that space will interact with it is super important, and it's another piece that really needs to be thought about in depth to ensure that you have a chance to launch the solution, whatever it may be, successfully.

So those are two great things.

- [Clare] I mean, I think one of the most obvious solutions to that is a third-party network, a standalone network, and deploying on that. But Google is slightly different in terms of managing their security by bringing it back onto a Google-managed network rather than segregating it on a third-party network. It's an unusual way to approach things, I think, but the visibility and control allow for better management.

- [Ryan] Absolutely. So now that we've talked at a high level about how to test a product to help ensure a better chance of success, and we've covered data validation, network and security testing, where teams go wrong, a lot of good stuff: how do companies take these learnings and develop a strategy going forward to approach product testing correctly when it comes to things like data ownership, measures of success, lifecycle management, and the bunch of other pieces that come into that? How do you think about the strategy development side, and what would you tell our audience to be thinking about or how to go about it?

- [Clare] You mentioned lifecycle management there, and that's probably a topic all on its own. But for starters, security should take priority. Every time. There are going to be firmware updates and software updates that the owners need to be aware of, and they need to make sure they're deployed.

And all of that has to be managed throughout the life cycle. So it's almost like applying IT principles to IoT or OT technology. And then it's making sure you're considering all aspects of that life cycle. If at the end of the deployment, or the end of the project, or two years after a project, it's deemed not necessary anymore, or maybe the data's not being used, or maybe it is being used but it wasn't validated so it's not particularly useful, you need to think about decommissioning as well and how that's gonna happen. Again, you mentioned data ownership. Who is the owner of that data? Who's responsible for making sure it's decommissioned in a responsible manner? Going back to the policies: knowing what policies are in place and what your legal requirements are. They all have to be considered during the life cycle, so it's important to be asking the right questions.

- [Kira] It's important to just have all of those different parties outlined from the beginning, because you're not gonna have just a point-in-time conversation with them at the onset of the project. They're gonna be your partners over time. And so be really clear about who has responsibility for that solution or that program, and how you will continue to engage with them, both from a tactical standpoint in terms of the lifecycle of that product and in terms of the lifecycle of that data management. Both are really important for your interaction with that variety of teams.

And IoT is going to change so quickly and at such a fast pace that we need to make sure we're continuing to maintain the solution, maintain the data, maintain all of the requirements that we set out from the beginning over the course of its life cycle. So having a very clear plan around engagement with those different parties, and around how you plan to maintain the information about that technology, is, I think, really key to success.
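As a rough sketch of "applying IT principles to IoT," the example below shows one hypothetical shape a per-device lifecycle record could take, with a named data owner, patch tracking, and a decommissioning date. All field names, thresholds, and example data are invented for illustration, not taken from Google's program.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DeviceLifecycle:
    device_id: str
    data_owner: str              # the party accountable for this device's data
    firmware_version: str
    last_patched: date
    decommission_due: Optional[date] = None  # set when removal is scheduled

    def needs_attention(self, today: date, patch_window_days: int = 90) -> bool:
        """Flag devices that are overdue for a patch or due for decommissioning."""
        overdue_patch = (today - self.last_patched).days > patch_window_days
        due_for_removal = self.decommission_due is not None and today >= self.decommission_due
        return overdue_patch or due_for_removal

# Hypothetical usage: surface every device an owner should look at this week.
fleet = [DeviceLifecycle("sensor-042", "workplace-team", "2.1.3", date(2023, 1, 10))]
flagged = [d.device_id for d in fleet if d.needs_attention(date(2023, 5, 18))]
print(flagged)  # ['sensor-042'] because the last patch is past the 90-day window
```

Keeping an owner and patch state attached to every device makes the decommissioning and data ownership questions Clare raises answerable at any point in the life cycle, not just at project kickoff.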

- [Ryan] Fantastic. As we wrap up here, I wanted to ask you a question as it relates to challenges that you've seen in the space from your vantage point, if you will. One of the areas I think would be really interesting to dive into is real estate companies that have lots of different properties trying to develop and deploy a solution across multiple buildings.

How is that thought about through what we've already talked about: the testing, understanding the end users, being able to validate the data, things along those lines? What challenges does that in and of itself pose, from your view?

- [Kira] I think it's really difficult. At enterprise scale, managing all of the different business requirements and needs is no easy feat. I think there are benefits and disadvantages to a central solutioning party. The benefit is that you have an opportunity to collect global needs, global use cases, global requirements, and then actionize against them.

The downside is that our buildings, our spaces, our needs are complex. And so managing a huge number of those and actually responding to them and actionizing them can be a long list of priorities. But I lean towards the benefit of that central opportunity, where you can gather all of those business requirements.

You can rank them and prioritize against them, identify the really important things to actionize, and then respond to those by going through that data validation exercise, that technology selection exercise, and all of the things we previously mentioned, in a prioritized order, based on casting that broad net and gathering the really critical business requirements.

- [Clare] Everything Kira said. I think for large real estate, standardization is definitely an avenue to explore, both in technology and in data. And the other thing I would say, going back to a point I made earlier: test your assumptions with pilots. Pilots can be an enormous learning opportunity.

And you can really save yourself some money by piloting first and rolling out based on your learning. So that'd be my advice.

- [Ryan] Fantastic. Well, I really appreciate both of you taking the time to do this. We haven't talked enough about product testing or really dived into the weeds here, where you all have the expertise, and I think our audience is gonna get a ton of value out of how to really think through these things.

We've touched on bits and pieces throughout the history of the podcast, but it's really never been consolidated into one good, solid conversation, so I really appreciate both of you jumping in here and talking about a lot of these important topics.

For our audience who potentially wants to follow up, touch base, learn more about anything you all have going on in this space, what's the best way that they can do that?

- [Clare] Yeah, reach out on LinkedIn. Probably the easiest.

- [Ryan] Perfect. Sounds good. But yeah, thanks again, both of you, for jumping in here and talking about all these topics. I'd love to hopefully do more content like this together. I think our audience is gonna get a ton of value out of this, so I'm excited to get this out to them.

- [Clare] Thanks for having us, Ryan.

- [Kira] Thanks so much.
