Ms. Dekkers and Dr. Zubrow talk about the global perspective on high-level maturity organizations.
TEXT TRANSCRIPT: 1 March 2001
Copyright 2001 Quality Plus Technologies and Carol Dekkers. All rights reserved.
Announcer: Welcome to Quality Plus e-Talk! with Carol Dekkers, brought to you by StickyMinds.com, the online resource for building better software. This program will focus on the latest in the field of technology. All comments, views, and opinions are those of the host, guests, and callers. Now let's join your host, Carol Dekkers.
Carol: And thank you for joining Quality Plus e-Talk! with Carol Dekkers. I'm Carol Dekkers, and I'd like to say thank you for listening, whether you're listening over the Internet or listening from the Phoenix area. Welcome to the show. This week's guest for show number nine is Dr. David Zubrow, and before I get into introducing David - you're on the line, correct, David?
David: I'm here.
Carol: You finally got through. It's interesting that with the old technology of phone lines, that oftentimes they're busy. So I'm glad that you're here, I'm glad that you got through. This is show number nine in a series of thirteen shows on building better software. And we've had a lot of different guests that have been here talking about testing, software developers versus software testers, we've talked about Test Management 101, we've had emerging extensions to the COCOMO II estimating model, and everyone has been looking forward to this show as much as the Tom DeMarco show we had last week. So David, I know that you have a lot of listeners who are out there. He's freshly back from India, and we're going to be talking about that in a couple minutes.
First I'd like to tell you just a little bit about myself and what I do. I'm the president of a company called Quality Plus Technologies, and we focus on helping companies build better software by measuring what they do. Through doing that, they can find out what they're doing well and what they're not doing well. And we use measurements such as function point analysis, which is really like a square-foot measure for software, defect tracking, and a lot of different things that can really tell you about your process. And one of the things that we've developed, which fits in well with this week's show, is a calendar that, on the left-hand side alongside the months, shows the Capability Maturity Model Integration model - the new capability maturity model - paired with where function point analysis can help you achieve each process area. So if anyone would like a copy of that, I'd like to invite you to send an email to firstname.lastname@example.org, and many of you may be listening through this Web site already, or through StickyMinds.com, who is our sponsor, and I'd like to say thank you to our sponsor.
Without further ado, I'd like to introduce Dr. David Zubrow, who is the team leader for the Software Engineering Management and Analysis, the SEMA group, within the Software Engineering Institute, at Carnegie Mellon University. His areas of expertise include empirical research methods, data analysis, data management, it goes on and on. Since his arrival at the SEI in 1992, he has been a member of the Capability Maturity Model Integration product development team, the CMMI development team. He's the lead developer of the software process maturity questionnaire. He has co-authored technical reports entitled "Software Process Automation." He has been the assistant director of analytic studies at Carnegie Mellon University. So, he has been a research coordinator. He has got a PhD in social and decision sciences, and an MS in public policy and management from Carnegie Mellon University. And if I went through and continued to read everything off his bio, I don't think we'd even be able to finish the show. David wouldn't even have a chance to speak. So I'd like to say thank you for taking the time out of what I know is a very busy day today, David, and joining us. So, welcome to the show.
David: My pleasure, Carol. One small correction - it's software engineering measurement and analysis, not management and analysis.
Carol: What did I...Did I say management?
David: Yeah, and don't worry about it, because other people have made the same mistake on occasion. But since it is our group, I like to be clear on that.
Carol: Well, I was corrected and I'm going to ask a quiz of our intelligent audience. Because this one tripped me up. I was corrected when I had Bret Pettichord on the show, and I introduced him as being part of "Seg" Software, and it was "S-E-G-U-E." Now, how would everyone pronounce "Segue"? Well, I had no idea that that's how "segue" is actually written. You know, when you segue from one topic into another, you spell it "s-e-g-u-e." So I was completely tripped up, and I guess the Freudian slip to do with measurement and management probably is similar.
David: Yeah, well, one serves the purpose of the other really, so...
Carol: That's right. I think I probably just interchanged it because we manage through measurement.
David: Right, and hey, by the way, too, just to talk maybe a little bit about some of my recent work and the trip to India. One of the presentations I gave over there was on, I called it "Putting 'M' in the Model - Measurement and CMMI," because as you're probably aware, one of the additions to the CMMI model is a measurement and analysis process area. Now, located at Level 2 in the staged model, and it's a supporting process area if you take the continuous view of the CMMI. And so I'm interested, I'll send you an email, but I'm interested in your function point mapping to CMMI as well.
Carol: Well, good, good. Well, I find it interesting that measurement is now going to be introduced in Level 2 when it used to be really a fundamental piece introduced really only in Level 3.
David: Well, actually I'll take a little bit of issue with that.
David: Because really, as you're aware, there has always been the measurement and analysis common feature associated with each process area, and one of the things that that common feature does is provides a way for the organization to get systematic feedback on how well its processes are in fact performing. And the phraseology goes something like "measurements are made and used to determine the status of the activities for..." and then fill in the key process area name. So it's really been there as part of the institutionalization of each process area. And in perhaps a less visible way, has always been there in certain of the key process areas, intermixed, if you will, among the various activities to be performed. Like in project planning, you know with respect to estimation, or in project tracking and oversight, with respect to comparing actuals and estimates of the plan.
David: So it's been there, but I think one of the nice things about the CMMI formulation here is that it's the notion of measurement is much more visible, which I personally, of course, believe is a good thing. When you think about process improvement, how do you distinguish change from improvement? How do you determine if something has improved or if it's just been a change? Well, if you're not measuring, you can't distinguish those two things. Or heaven forbid, change from a degradation in performance. You just don't know unless you're measuring, so I think it's really good.
Carol: And let me ask you this. The Software Engineering Process Group conference that you went to in India. That was in, whereabouts in India was that?
David: Well, I did a day's worth of tutorials in Bangalore, which is the, sort of the center, if you will, of the Indian software industry. And a day's worth of tutorials in Delhi, and then the conference program itself was in Delhi.
Carol: Oh, wow. So you were over there for a fair number of days.
David: The whole thing was about eight days long. Of course, on the way there, it took from Friday evening until Sunday morning to get there, and then coming home, Saturday was one of those 35-hour-long days.
Carol: For anyone who's crossed the dateline, they know exactly what we mean.
David: Saturday just kept going on and on and on.
Carol: And you get all those free international drinks that you can't take advantage of, because you're not going to drink for 35 hours. And it would be awful, so, yeah, I've been on a few of those trips to South Africa and places like that. It's not fun.
David: Well, it was fascinating to have the time to meet many of the people there in India, from a number of companies that probably have recognition here in the States as well. And one of the things they pointed out to me was a new IBM building that was being built right in Bangalore.
Carol: Oh, wow.
David: An office building.
Carol: I'm going to ask you a real quick question. I'm going to "segue" into a real quick question.
Carol: For any of those companies who have been actively trying to get the CMM installed into their organization, and now the CMMI comes in, some of them may or may not know how that happened, and what the difference between the CMMI and the CMM is. So if you could just spend a minute, just kind of bringing them up to date, that would be great.
David: Okay. Let's see. Well, how did it happen? Well, as you, hopefully the listeners are aware, there are several CMMs out there in the world. There's the CMM for software, which I would argue is probably the most well known, and most widely used around the globe. But there was also an effort to create a systems engineering CMM, which then would expand the scope, if you will, of the modeling activity sort of from cradle to grave in some sense, in terms of product development, not just software development, but product development. There's a software acquisition capability model. There's a people capability maturity model. So there's a variety of capability maturity models out there. And one of the things that companies were asking for and as well, the Department of Defense, is...Is there some way to combine these so that from a process...since all these things have to play out in a company or in an organization, is there some way to get in one sense some economies of scale, to reconcile them, to harmonize them, to integrate them in a way that provides, so that they can be used more consistently and coherently within organizations. Hence was born the notion of CMMI. And right now, the current, and in fact it's available from the SEI Web site, are some various drafts of particularly the combination of systems engineering and software engineering. I think that's been good, because the original CMM acknowledged there is a need for an interface between those two engineering groups.
Carol: And would you like to give out the SEI Web site for people that may not know it?
David: Sure. For folks who don't know it, it's www.sei.cmu.edu. And if you go to the home page there, you can look, there's a little drop down menu, and you can pick out CMMI - capability maturity model integration.
Carol: And we'll be back with more of Dr. David Zubrow after these short messages. Please join us.
Welcome back to Quality Plus e-Talk! with Carol Dekkers, and we've been talking to Dr. David Zubrow of the Software Engineering Institute, of the Software Engineering Measurement and Analysis Group. I got that right this time. And I'd like to invite people to call in if you'd like to talk to us or say hi or just give us a comment. The toll-free number is 866-277-5369, that's 866-277-5369. And before we went into break, we were just talking really briefly about the Software Engineering Institute Web site. And for anyone that doesn't know, the SEI Web site is similar to StickyMinds.com's Web site - I don't know if you've been out there. It's really a full-fledged resource. The StickyMinds.com Web site is there for testing and software development professionals and has a plethora of articles, news releases, and that type of thing. And the Software Engineering Institute Web site, which I've been out to a number of times, is the same type of thing, specific for process improvement. The .... piece there actually collects measurement data, doesn't it, David?
David: We do get some things; actually, where we've really focused that effort is something called the Software Engineering Information Repository.
Carol: Which is the SEIR?
David: And that's at seir.sei.cmu.edu. And while we do ask people to register with us, there's no fee for using the site. And on the Web site, one of the things we're trying to do is sort of this idea of establishing a community of interest, really making it sort of a two-way exchange of information. So we ask people to contribute either articles or presentations, or templates, various kinds of documents, data if they have it. It's a repository; we take those contributions, turn them around, organize them, and make them available to the rest of the community. And so we have an awful lot of information, as you might expect, in terms of software process improvement out there. And for the listeners that are familiar with the Community Maturity Profile, which is the briefing that the SEI produces twice a year on how many companies are at level 1, 2, 3, 4, 5, the interactive version of that within the Software Engineering Information Repository will allow a user to filter the data, for instance, by Standard Industrial Classification Code. So if you wanted to create your own maturity profile for the financial industry, let's say, you can actually go in and filter the data and do that within the SEIR.
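[Editor's note: the kind of filtering David describes can be sketched in a few lines. This is a minimal illustration only - the records, column layout, and function names are invented for this example and are not the SEIR's actual data or interface.]

```python
# Hypothetical sketch: building a maturity profile from assessment records,
# optionally filtered by Standard Industrial Classification (SIC) code.
# All records below are invented sample data.
from collections import Counter

# Each record: (organization_id, sic_code, maturity_level)
assessments = [
    ("org-a", "6021", 2),  # 6021 = national commercial banks (financial)
    ("org-b", "7372", 3),  # 7372 = prepackaged software
    ("org-c", "6021", 3),
    ("org-d", "6022", 4),  # 6022 = state commercial banks (financial)
    ("org-e", "7372", 1),
]

def maturity_profile(records, sic_prefix=""):
    """Count organizations at each maturity level, optionally keeping
    only those whose SIC code starts with the given prefix."""
    levels = [lvl for _, sic, lvl in records if sic.startswith(sic_prefix)]
    return dict(sorted(Counter(levels).items()))

# A profile for just the financial industry (SIC codes beginning "60"):
print(maturity_profile(assessments, sic_prefix="60"))  # {2: 1, 3: 1, 4: 1}
```

The same function with no prefix reproduces the community-wide profile, which is exactly the slice-and-filter idea behind the interactive version.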
Carol: And you can get to that within, if you go to the main sei.cmu.edu, you can get into the repository and to the analysis, and it's really a gateway that allows you to go in and see all sorts of things. Kind of like a candy store.
David: Yeah, well, you can browse around in there and, I guess to follow that analogy, browse around and sort of shop for what you want. Sure.
Carol: And free samples.
David: And free samples. Absolutely. No charge.
Carol: A lot of people will be thrilled with that. Your trip to India. I think a lot of people are really interested. The high-maturity organizations. There's been a lot of press on level 1s, level 2s, level 3s, and as companies are coming up the maturity level, 3s and 4s are the ones that have been sending me emails and saying, "What about these high-maturity organizations? What is it that makes them different? How many of them are there? It used to just be a handful." What did you find from a worldwide perspective? I know that Indian conferences and a lot of the European conferences are a lot more international in flavor than our domestic conferences usually are. What did you find in terms of acceptance of the CMMI, and kind of the evolution, are there a lot more high-maturity organizations? What's the world perspective on this?
David: Well, okay, let me talk about India a little bit since I was just there and I'd say most of the participants in this particular conference, however, were from within India. Although from various portions of India. Actually, that was another sort of maybe my failing in geography, but I just really didn't understand how big it was and how spread out in some ways the various population centers really are. But it was interesting, having basically done two full days of tutorials, and having, if you will, a captive audience.
David: For that length of time. And talking with the folks. And one of the things that becomes very clear is that's perhaps a little bit of a contrast is they're all very focused on reaching level 4 or 5 of the CMM, rather than sometimes you'll hear here in the States, for instance, "We just want to get to level 3." Sometimes you hear those kinds of comments and that's a little bit of a difference. Now, the other thing that's interesting when people ask you what's one of the hallmarks that's differentiating, let's say a level 3 from a level 4 or 5 company, we are talking about measurement, particularly in my view, is the ability to actually leverage the data for the purposes of decision-making and improvement.
Carol: And I guess we're going to break real quick. This segment actually went a lot quicker than the last one. And we'll be back right after these short messages, with more of Dr. David Zubrow and talking about the CMMI, high-level maturity organizations.
And welcome back to the show. It's been, the last half hour we've talked to Dr. David Zubrow of the Software Engineering Institute. And we've been talking about his trip to India to the Software Engineering Process Group conference that was in India. David did two days' worth of tutorials and participated very heavily in the conference that was there. Before we went into the break, we were talking about higher-level maturity organizations. And one of the things you said that I thought was fascinating is that you ask companies, you hear a lot of people saying, "We just have to get to level 3," and that Indian companies were saying, "We're going to get to level 4 and 5." And in a lot of...I have kids, and one of the things I always hear with kids is that kids will rise to your highest level of expectations. So if you expect them to get Cs in school, they'll get Cs in school. If you expect them to get As in school, you turn around and, wow, they're getting As in school. Is it kind of a matter of expectations? That if you expect to get to level 4 and 5, you do? Or is it a difference in culture, do you think, David?
David: Well, I think there's a big cultural component to it. I think there's a big marketplace component to it. I think in some cases, too, that maybe the work force is perhaps more accustomed to a disciplined process, or is more accepting, perhaps, of a disciplined process. And so is eager to move on. And I think the other aspect of this that sort of works with that, is the notion that the Indian government and software industry have decided to make a very conscious effort to become recognized in the software world. And have latched on, if you will, to CMM-based software process improvement as one way of gaining recognition, and through that recognition, hopefully gaining market share. One thing that occurred, or one notable event during the conference was actually the close-out of the conference. And it was done by Dr. Kohly, I hope I got his name right, but he was the former president of Tata Consultancy in India, and Tata is an enormous organization there, and their consultancy services is basically their software arm. And he talked a little bit about the positioning of the Indian software industry, and he made an interesting analogy, which was I think perhaps along the lines of your statement there Carol, about trying to encourage people, because although there was sort of a celebratory mood, let's say, within the conference, and within those attending from the Indian software industry, he said, "You know what? We in India are earning..." And I'll make up the numbers, but let's say they have captured maybe 3.5%. It was something, it was less than 5% of the worldwide software market. "And we have a billion people in India." He said, "You know, the Israelis have about 2.5% of the worldwide software market, and they have 3-1/2 million people. 
So before we get too exuberant about our performance and our result and our focus and progress in software here, maybe we should stand back and really take stock of where we are and what resources we actually have to apply and how well we're actually doing." So he was kind of providing a little bit of sobering, if you will, to the crowd. And I think as you said, laying out a challenge for those in attendance, to say "we need to expand the kinds of software we're developing, continue to make a name and attracting revenue and business into the country." And I think a lot of what they have done has been primarily in the IT arena. Perhaps less so in some of the more emerging kinds of technologies, you know, mobile networks and things like that.
Carol: Right. When you were saying, it just kind of hit a chord, when you said that these companies are so new, it kind of rang an analogy with the voting that we had happen here back in November, where the newer states had the most advanced voting mechanisms, and the older states were still using a lot of old things. And what hit me about what you said about India having these new companies, is it's almost like, you know they've got brand-new rocket trains, and they can get those up and going in one direction very very easily because they're laying the track in front of them, essentially as they go. So you can get very quickly to a level 4 and 5 because you've got no baggage to have to go back and redo. And for many of us in companies, we've essentially got these enormous, very very productive, very productive in terms of profitable, companies that have these freight trains that are running on old tracks. And not only would we have to build brand-new tracks, but we've got to take the old tracks out, make people start moving in the same direction. And I think that companies here in the United States, that have been established, and that have reached level 4 and 5, even level 3, have an incredible amount to be proud of. Even over any of the newer companies in India, because it was far easier in India than it would be for any company here.
David: I would agree with you, and maybe even take it a little bit further, in the sense that it's not only the companies themselves, but the nature of the contracting mechanisms we have in the marketplace as well, that are all part and parcel, if you will, of establishing a successful collaboration, and one of the pieces to that...It's very hard, you know, for some companies to resource process improvement when they're in certain kinds of contractual relationships. It's hard for them to find the resources, free them up, and commit to it. And for those that do, however, I agree, they have put their faith and money into process improvement and hopefully have recognized the benefits and gains from it. And I think that's another part of it, is simply the resolve and the will to do it. And we certainly have some very notable examples of that here in the U.S. And so I think it's perhaps easier, you know, the story of Motorola India was it started off brand new as an organization designed to operate at level 3 from the beginning with about 80 people. And it grew on from there. They've done a lot of interesting things in terms of investment and training that you may not see quite as often here as here in the U.S. But again, the idea of starting with that plan in mind, I think, is a big plus.
Carol: Now for companies that are pursuing level 3, level 4, level 5, I've heard some companies say, "Let's just get to level 3 so we can get there, and then we'll put level 4 as the next goal, and level 5 as the next goal." And I think there's a lot to that, it's almost like let's lay the baby steps, because if we lay it out too far up level 5, we'll never get there. What can companies that are pursuing, they may be level 3 now, or just about level 3, heading to 4 and 5, what can they learn? Are there any best practices that these companies in India, and companies worldwide, have done that we can learn here in the States? Maybe save us time and money as we're heading toward level 4 and 5.
David: Well, yeah, there's one thing that, and in fact I can tie this back into CMMI and then to measurement, which is, one of the biggest challenges for achieving level 4 is actually being able to use the data that have been collected to support decision making. For doing quantitative process management. And too often what's happened is that companies have embarked on collecting data at levels 2 and 3 primarily to...without a good purpose in mind. They do it because they have a sense you need to be doing this to satisfy the various key process areas that you want to move up. But then they get to levels 4 and 5, and now they're starting to really focus in on how do we leverage this information as part of our decision making? And realize that to a certain extent, gee, we just weren't collecting the right thing. I sort of have mixed feelings about that, because it would be great if they could foresee that and incorporate, or design the data collection to support their decision making, their project management functions, their process management functions. Right from the get-go. And so that would be the ideal case. On the other hand, at least in the former, at least they're getting some discipline and routine in place associated with collecting data. And so I guess, you should be thankful for what you do have, at least. But I think it does come back to haunt them, as they try to go up higher on the maturity scale.
Carol: Would you agree that Victor Basili's goal question metrics might help out at that level?
David: Oh, absolutely. GQM, I think, is a good way to focus in on identifying first what's the purpose. And that's what you've really got to figure out. Like establishing what are the requirements.
Carol: Right. Let's talk a little bit more about goal question metrics when we get back from our short break.
Welcome back to the show. Thanks for listening. We've been talking to David Zubrow, who has returned from the SEPG conference in India. And we've been talking about high-maturity organizations. What does it take to get there? What are some of the cultural differences between India and the United States and that type of thing. And I'll mention one thing, because I'm Canadian. One thing that kind of hit me, too, is that in a lot of British Commonwealth-based countries, there is a very easy acceptance of anything prescriptive. It is absolute; you don't even question it. And it's hard to describe the difference unless you've lived in two cultures. And being immersed in the American culture, I look back and say, "Well, why did we just follow those things?" The government makes decisions and you follow them. An expert comes in and says, "You will do it this way." And you do it. And I think in a lot of cases, that's why India in particular has just embraced the CMM. "If it's the best way of doing it, we'll just do it." Do you have any comments on that, David?
David: That could be...I'll tell you what, the thing that came to mind was the traffic over there. And it's anything but orderly. But I think you do see a lot of that within the software engineering community. And to a certain extent, that search, if you will, for best practices too, that can be incorporated into the way they do their work over there. The adherence aspects, quite frankly, I didn't get into too many conversations with people about, strictly speaking, although a number of the participants in my tutorials were from quality assurance groups. And were talking a little bit about their roles as not only doing sort of product assurance kind of activities, but also process assurance activities.
Carol: Okay. Now, did the people in India...We were talking about goal question metrics, starting out with your goals of measurement before you actually pick your metrics. And I know a lot of my clients will come to me and say, "We need to implement function points," "We need to implement defect tracking." And they say, "What do we have to do?" And I say, "Well, what do you want to do with this?" And they're silent at the end of the line. Do companies in India...Are they more receptive to the goal question metrics? Do they do more planning?
David: I think along that score, perhaps they're really not any different. As I talked to...We actually incorporate the notion of GQM into a lot of the training that we do. And really try to get people to think through first, what's the purpose of gathering some data anyway? What do you really want to know? Not, what should I measure? We start with, what do you want to know and why? And it's interesting, because quite often, the first question is, what should I measure? And you've got to push people back. And there's the analogy there with requirements. You don't just start coding. You have to figure out, what are we trying to do here? What do we want to accomplish? And the same thoughts really apply to measurement. What we actually have done - a useful way for companies to proceed along this line, we have a course called Implementing Goal-Driven Software Measurement. And one of the things that we do is, we get people to think about what is it they want to know, or what do they want to learn. Draw me the indicator, or the graph. What would help you make that decision? What would help you understand the current status of the project? Whatever it is, whatever the purpose is, draw me the picture or the graph with sort of a mock-up, if you will, with the display, the line chart, the bar chart, scatter diagram, whatever it is, that would help you answer the question. Because that in turn can become the specification for what do you really need to measure, with what kind of frequency, where are you going to find the data in your processes, how is it going to be generated... It really becomes the specification for laying out, what then do you need to go out and measure. So we find that that's a pretty useful elicitation technique in terms of helping people think through what do they want to measure and how to get their measurement activities started.
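[Editor's note: the top-down flow David describes - purpose first, then questions, then the measures that answer them - can be sketched as a simple structure. The goal, question, and metric names below are invented for illustration and are not from the SEI course materials.]

```python
# Illustrative GQM (goal-question-metric) sketch: work top-down from a
# purpose to questions to the data that must be collected. All example
# goals, questions, and metrics here are invented.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)  # measures that answer it

@dataclass
class Goal:
    purpose: str                                  # what you want to know, and why
    questions: list = field(default_factory=list)

# Start from the purpose, not from "what should I measure?"
goal = Goal(purpose="Understand whether our defect-fix process is improving")
q = Question(text="Is the time to close a defect trending down?")
q.metrics = ["defect open date", "defect close date", "severity"]
goal.questions.append(q)

# The metrics list then becomes the specification: what to collect, how
# often, and where in the process the data are generated.
for question in goal.questions:
    print(question.text, "->", question.metrics)
```

The "draw me the indicator" step in the course maps onto the same structure: the mock-up chart a person sketches determines which `metrics` entries the question actually needs.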
Carol: And it's aligning what you're going to be measuring with something that's coming down the pike in the future. So if level 4 says you must use the measures to manage and make decisions, at level 3 we need to be saying, what decisions are we going to need to make?
David: We need to do that, and one of the real differences is between establishing arbitrary thresholds, however important they may be, and really understanding, if you will, the voice of the process, which is a level 4 concept.
Carol: And we'll be back to close out with more of Dr. David Zubrow and Quality Plus e-Talk! with Carol Dekkers after this short message.
And we're into our final segment with Dr. David Zubrow of the Software Engineering Institute. And David mentioned at the break that there's going to be a brand-new community profile. What's it exactly called, David?
David: It's the Community Maturity Profile, and it's the report I mentioned earlier, where we provide the results of all the assessments reported to us. And one new feature, if you will, of this particular profile will be, we have information on twelve reassessments, where organizations have gone from level 4 to level 5, and it looks like, based on those twelve, the median time to move between those levels is about 22-1/2 months. And sort of in relation to our earlier conversation about the challenges of going from 3 to 4, the data are showing that that's been taking around 30 months. And it jibes, too, with some personal experience I've had working with companies moving from 3 to 4, that there is kind of a hurdle there in terms of really getting the data organized, selecting the right data, and marshaling it for that purpose.
Carol: So if somebody's in an SEPG role, or a quality assurance role, and they'd like to show their management how long it's been taking companies on average, they can grab this profile. I assume it's in an Excel spreadsheet or something that they can slice up.
David: It's a PowerPoint presentation.
Carol: In a PowerPoint presentation. And they can get that by going to which particular page?
David: If you just go to the main SEI Web site page, in the drop down list you'll see Community Maturity Profile, you'll see Maturity Profile, you'll see Assessment Results. There's a variety of ways and they all get you to the same place.
Carol: That's great. And I think a lot of listeners have really appreciated the time that you've taken with us. I wish we had more time and we could talk about your trip to the Taj Mahal. I thought it was very interesting in the email you sent me where you said, "I blew off one day and went to the Taj Mahal."
David: I'm not sure that's for general consumption, but sure. You get that close and that far, you want to do it.
Carol: Well, no kidding. You're that close, and it's kind of like saying, "Well, I'm here at the ocean, I'm not going to look." But I'd like to say thank you for giving us an hour of your expert time. You won't be at the SEPG conference which is coming up in two weeks in New Orleans, but you will be, and you are the chairperson for the 11th International Conference on Software Quality with the American Society for Quality next year in Pittsburgh.
David: Right. October 22 to 24 in Pittsburgh here.
Carol: And there'll be more information about that forthcoming. Next week's guest, which will be show number 10, March 8, we're going to be featuring Dr. Alan M. Davis, president of Omni Vista, and he's a renowned software requirements management expert. He was the former editor in chief of IEEE Software, and he's going to be talking about requirements management in Internet time. So that's going to be an inspiring, pretty interesting show. And I'd like to share with you one of the comments that one of my listeners sent back to us. She said, "Listening to the show is kind of like sitting around a fireside chat. And being able to sit back, have coffee, and listen to you talk to the experts." And I really am pleased that David took the time to sit down and share his expertise in an informal, casual way. When you go to a conference, you don't often get that interaction. Somebody's up on a pedestal at the front, and you kind of look at them from afar. And I think that this forum is a really good forum to talk to, really, the cream of the crop in this industry.
David, I'd like to say thank you. And do you have any final words of wisdom for our listening audience in terms of CMMI implementation?
David: Well, in terms of the CMM, I think there's a lot of conventional wisdom out there, which is you have to get that management support, and you just have to take those steps. I think it's really a matter of commitment, willingness to improve, taking a look at the way you're doing things, and just figuring out a better strategy.
Carol: And I think you've shared a ton of information with us. We could probably talk for another three or four hours, but we'd be moving into everyone's lunch hour, their dinnertime, and around the world, I'd like to say thank you for listening to Quality Plus e-Talk! with Carol Dekkers. Thank you for joining us this week. Please join us next week and the weeks after, as we feature a lot more experts on building better software. I will e-talk to you next week.
Copyright 2001 Quality Plus Technologies and Carol Dekkers. All rights reserved.