I attended CSUS from spring 1988 through fall 1990. Now, 20 years later, I returned to attend a software conference. I met my former instructor Dick Smith, former Chair Anne-Louise Radimsky, and several current students and faculty. The event was organized by Cici Mattiuzzi, who was the career coordinator when I was at CSUS and with whom I’ve stayed in touch.

A few photos and my lecture notes follow. (PDF version is here)

First Annual Conference on
Quality Software Development

Software is playing an ever-increasing role in our day-to-day lives. The inaugural CSUS Software Engineering Conference provides a forum for individuals and organizations seeking technologies, concepts, and methodologies to improve the quality of their software products, processes, and services, and looking for networking and learning opportunities.

Friday, October 10, 2008 - 8:30 AM to 4:30 PM

University Union, Ballroom

9:00 Welcome
Emir Macari, Dean, CSUS College of Engineering and Computer Science
9:15 Software Development at Google
Dave Ferguson, President, Mahalo Logic (Director of Engineering, Google 2004-2008)
10:45 How We Entered the Cloud: Computing on the Web
Bob Batchelder, Quality Assurance Manager, LexisNexis
12:00 Lunch - location to be announced
1:00 Extreme Project Management
Edward S. Allen, Senior Project Manager, CA Legislative Data Center
2:15 Agile Software Development and Quality at FTB
Nadean Shavor, Network Operations Center Manager, FTB
Nadeem Shafi, Application Developer, FTB
3:30 Are We Building the Right Product?
Mike McCullough, Engineering Program Manager, Hewlett Packard

Sponsored by the CSUS Department of Computer Science, College of Engineering and Computer Science, Hewlett Packard, Chevron, Sacramento Area Quality Association, ISSA & IEEE

Software Development at Google
Dave Ferguson, President, Mahalo Logic (Director of Engineering, Google 2004-2008)

People

Google’s quality process begins by hiring the most capable people. They look for:

• Breadth – knows computer science
• Depth – must be expert in something
• Reputation – gets things done, follows through on projects from beginning to end

Project managers do not act as hiring managers. Hiring is centralized. Then a central committee places new hires on teams.

Google founder Larry Page reviews all hiring decisions – about 23,000 so far.

http://en.wikipedia.org/wiki/Larry_Page

Once hired, employees find an environment that encourages cooperation. Project managers cannot grant promotions; these are based upon recommendations by peers – and a key factor is how much assistance workers provided to teammates.

Dave observed that many employers seem to have “built-in systems that encourage contention” and prima-donnas. The corporate culture at Google strives to avoid these.

Process

Employees spend 20% of their time “doing whatever they want.” If they think they’ve done something of value they pitch the idea to try to get resources.

Projects start with a short (5-10 page) “design document” followed by a design review. They don’t use a specific methodology such as UML diagrams – the document just states in clear language what the system will do and how.

However, they do obsess about coding style, and employees who are “certified” in the coding standards review the code.
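
As an illustration of the kind of mechanical check a style reviewer might automate (this is my own sketch, not Google’s actual tooling), consider a small script that flags overly long lines and tab indentation:

    # A trivial automated style check (illustrative only, not Google's internal
    # tooling): flag lines longer than a chosen limit and lines indented with tabs.
    import sys

    MAX_LINE_LENGTH = 80  # hypothetical limit, chosen for illustration

    def check_style(path):
        problems = []
        with open(path) as source:
            for number, line in enumerate(source, start=1):
                stripped = line.rstrip("\n")
                if len(stripped) > MAX_LINE_LENGTH:
                    problems.append("%s:%d: line longer than %d characters"
                                    % (path, number, MAX_LINE_LENGTH))
                if stripped.startswith("\t"):
                    problems.append("%s:%d: tab used for indentation" % (path, number))
        return problems

    if __name__ == "__main__":
        for filename in sys.argv[1:]:
            for problem in check_style(filename):
                print(problem)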

They do usability testing and unit testing.

QA is done “when the lines of code are written.”
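
As a generic illustration of a unit test written alongside the code it covers (using Python’s standard unittest module; my sketch, not Google’s internal framework):

    # A unit test written next to the function it covers, using Python's
    # standard unittest module (a generic illustration, not Google's framework).
    import unittest

    def normalize_query(query):
        """Collapse whitespace and lowercase a search query."""
        return " ".join(query.split()).lower()

    class NormalizeQueryTest(unittest.TestCase):
        def test_collapses_whitespace(self):
            self.assertEqual(normalize_query("  Hello   World "), "hello world")

        def test_empty_query(self):
            self.assertEqual(normalize_query(""), "")

    if __name__ == "__main__":
        unittest.main()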

Google has a separate team that supports libraries for DB connection, logging, and other housekeeping. This is one way that consistency is enforced in a large organization.
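
To picture what that looks like from a product team’s side, here is a hypothetical sketch of such a shared “housekeeping” module (the names and defaults are mine, not Google’s):

    # Hypothetical sketch of a centrally maintained "housekeeping" library:
    # every team obtains logging and database connections through the same
    # entry points instead of rolling their own.
    import logging
    import sqlite3  # stand-in for whatever datastore the real library wraps

    def get_logger(name):
        """Return a logger configured with the organization-wide format."""
        logger = logging.getLogger(name)
        if not logger.handlers:
            handler = logging.StreamHandler()
            handler.setFormatter(logging.Formatter(
                "%(asctime)s %(name)s %(levelname)s %(message)s"))
            logger.addHandler(handler)
            logger.setLevel(logging.INFO)
        return logger

    def get_connection(database="app.db"):
        """Return a database connection with the organization-wide defaults."""
        connection = sqlite3.connect(database)
        connection.execute("PRAGMA foreign_keys = ON")
        return connection

    # A product team would then simply write:
    #   log = get_logger("checkout")
    #   db = get_connection()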

Before a project gets released, it is presented to Larry, who can be brutal. He will challenge it to be better and more productive. It also gets passed through a security review.

Their process is highly iterative, with many projects being enhancements to current products, such as the pending features in Gmail Labs:

http://labs.google.com/

Due to ongoing acquisitions, they have had to learn to work with teams that are not co-located.

How We Entered the Cloud: Computing on the Web
Bob Batchelder, QA Manager, LexisNexis

The initial theme was how the “cloud” has features in common with what mainframes provided 40 years ago.

The concept is that services are available for use, often billed “per resource utilized” (much like buying CPU time on a mainframe), without the consumer having to worry about the technical details of security, backup, and such. Quality is achieved by allowing experts in security and reliability to manage those aspects.
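
A back-of-the-envelope sketch of what “per resource utilized” billing means in practice (all rates and usage figures below are hypothetical, chosen only to illustrate the comparison):

    # Metered "per resource utilized" billing vs. buying hardware up front.
    # All rates and usage figures are hypothetical.
    HOURLY_INSTANCE_RATE = 0.10   # dollars per server-hour (hypothetical)
    GB_MONTH_STORAGE_RATE = 0.15  # dollars per GB-month (hypothetical)

    def monthly_cloud_cost(server_hours, storage_gb):
        return server_hours * HOURLY_INSTANCE_RATE + storage_gb * GB_MONTH_STORAGE_RATE

    if __name__ == "__main__":
        # A small startup running 2 servers around the clock with 500 GB of data:
        hours = 2 * 24 * 30
        print("Monthly metered cost: $%.2f" % monthly_cloud_cost(hours, 500))
        # Compare against $12,000 of purchased hardware amortized over 3 years:
        print("Monthly owned-hardware cost: $%.2f" % (12000 / 36.0))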

The presentation utilized the diagram and information on this page:
http://www.3tera.com/Cloud-computing/


Today startups can be launched on cloud providers with minimal investment. A DB can be designed and deployed on Amazon SimpleDB: http://aws.amazon.com/simpledb/ 
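
For example, here is a minimal sketch of creating and querying a SimpleDB domain, assuming the boto Python library (the credentials, domain, and item names are placeholders, and the exact calls should be treated as illustrative):

    # Minimal sketch of a tiny datastore on Amazon SimpleDB, assuming the boto
    # Python library; credentials and names are placeholders.
    import boto

    conn = boto.connect_sdb(aws_access_key_id="YOUR_KEY",
                            aws_secret_access_key="YOUR_SECRET")

    # Create a domain (roughly analogous to a table) and store an item.
    domain = conn.create_domain("customers")
    domain.put_attributes("customer-001", {"name": "Acme Corp", "plan": "basic"})

    # Read it back, either directly by item name or with a SQL-like query.
    print(domain.get_attributes("customer-001"))
    for item in domain.select("select * from `customers` where plan = 'basic'"):
        print(item)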

Salesforce can provide many software services for a low monthly fee: http://www.salesforce.com/ 

The NY Times used Amazon EC2 to provide the terabytes they needed to archive past issues – they decided that this made the most economic sense, rather than investing in hardware that would need to be maintained and eventually replaced: http://aws.amazon.com/ec2/ 

Within the enterprise we are also moving towards a cloud, for example with Microsoft Hyper-V:

http://www.microsoft.com/windowsserver2008/en/us/hyperv.aspx 

There is a Cloud Conference: http://www.cloudsummit.com/ 

Additional info in BusinessWeek article:

www.businessweek.com/technology/content/nov2007/tc20071116_379585.htm 

Extreme Project Management
Edward S. Allen, Senior Project Manager, CA Legislative Data Center

Edward gave a full-on presentation of how the LDC is using extreme project management. He relies upon books by Doug DeCarlo:

http://www.dougdecarlo.com/

http://www.projectconnections.com/articles/decarlo.html 

Preview of book: http://books.google.com/books?id=Z5SSIsA4M-0C 

The premise is that the Newtonian approach involves too much up-front planning. Extreme project management instead takes a quantum, “right brain” approach that is creative and operates with fewer guidelines, as summarized here:

http://it.toolbox.com/blogs/coneblog/interview-doug-decarlo-einstein-4629


Are We Building the Right Product?
Mike McCullough, Engineering Program Manager, Hewlett-Packard

The core of Mike’s presentation was the importance of engaging the customer (users, stakeholders) in the test process as soon as possible.

Pre-Release Validation Test (PRVT) is a proactive process of analysis, review, and testing that partners with the customer during the development lifecycle to reduce ambiguities in requirements, design, and acceptance criteria. PRVT validates features and capabilities of the proposed final system before the product is released to market.

Mike explained how his team is using .NET to quickly prototype web interfaces that have the look and feel of the final system, letting users try them out so the team can gather feedback on usability and missing features.

Mike also spoke about “Requirements Gap Analysis.” Here are some Google matches for “Gap Analysis”:

www.9001resource.com/how_to_conduct_a_gap_analysis/how_to_perform_a_gap_analysis.php

http://searchcio-midmarket.techtarget.com/sDefinition/0,,sid183_gci831294,00.html

DEFINITION - In information technology, gap analysis is the study of the differences between two different information systems or applications, often for the purpose of determining how to get from one state to a new state. A gap is sometimes spoken of as "the space between where we are and where we want to be." Gap analysis is undertaken as a means of bridging that space. Among the various methodologies used to perform gap analysis is IDEF, a group of methods used to create a model of a system, analyze the model, create a model of a desired version of the system, and to aid in the transition from one to the other.
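
As a concrete, if simplified, illustration of the idea (my own sketch, not from the presentation): treat the current and desired systems as sets of capabilities and compute the difference.

    # Simplified requirements gap analysis: model the current and desired
    # systems as sets of capabilities and report what must be added or retired.
    current_system = {"user login", "order entry", "nightly batch reports"}
    desired_system = {"user login", "order entry", "real-time reports", "audit trail"}

    missing = desired_system - current_system   # capabilities still to be built
    obsolete = current_system - desired_system  # capabilities the new state drops

    print("Gap to close:", sorted(missing))
    print("Candidates to retire:", sorted(obsolete))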