Microsoft career summary
Roles I've had at Microsoft:
UX Researcher - IC and M2 manager roles covering a broad scope of Office and other products.
UX Designer - M2 manager role with a focus on Office client apps.
Market Researcher - IC role conducting IT pro and developer audience research.
Program Manager - IC roles on the Excel team and driving Office suite-level NPS research.
Product Planner - IC and M1 manager roles focused on IT pros, desktop deployments, and shifting engineering culture and practices to be more customer-driven.
Key capabilities developed through the range of my Microsoft experiences:
Identifying the nexus of human needs and tech solutions. Through a myriad of research methods, as well as by helping to define and leverage design and assessment frameworks, I've learned how to define software experiences that are both needed and possible for various audiences. Examples of such work range from creating UI elements that are still used in Excel 15 years later, to helping define product roadmaps aimed at IT pro audiences, to pushing for the customer-driven culture shift.
Building and leading teams, and just being human. Both in formal roles, such as running a 30-person team, and in less formal leadership roles, I've gained valuable leadership experience. When it comes to team building and identifying talent, I'm willing to be creative, looking for a combination of raw talent, general team fit, and the ability to grow from wherever someone currently is. I think of myself as practicing 'quiet leadership': leading by example and practicing total transparency. I'm mindful that team health isn't just the aggregation of each individual's needs; for example, business context and needs have to be brought into the thinking about how to make a team operate effectively. I've learned to deliver tough messages, and how to think through strategies such as how much to attend to top talent versus those who are struggling. From what I've done - wrong and right - as a team leader, I've learned the importance of honesty, transparency, and just being authentic and…human.
The importance of fundamentals. Consistently in my career, I have stressed getting fundamentals right. Bling and flash don't mean a thing for productivity solutions if you ain't got that fundamentals swing. The key is to think holistically and to treat fundamentals (such as performance and reliability when it comes to software) as highly necessary but far from sufficient.
Design. My time as a Design Manager gave me the golden opportunity to work with and learn from designers. This experience gave me an understanding of how to think and build from the ground up (again, the importance of fundamentals!) and outside-in (thinking out of the box and not being tied to incremental thinking). While I wouldn't market myself as a product designer, I can and do partner with designers effectively, working within how they think and work.
Engineering. Except for a brief stint in market research, my career at Microsoft has been within engineering, affording me the privilege of working closely with a myriad of engineering teams and leaders. Though I’ve mainly been in research roles, I also have experience being “on the other side” from my days as an Excel PM. And as a researcher I have experienced different types of relationships with engineering teams, from a fully embedded model to something more akin to a studio model, and things in between. I can speak from experience as to how to set up a research team for success and impact.
Leading with and without (official) authority, and landing impact. Regardless of where you are in the reporting hierarchy, your value to the business also depends on the contributions you make as an individual. To be effective and have impact at Microsoft, you must understand how things work and why, know what the lightning-rod issues are, what the influence model is on a particular project, and so on. Getting something done at one place and time may require very different tactics than accomplishing the same thing at a different place and time. It's all about having clarity on goals and then adjusting tactics as necessary. I've been lucky enough (?) to work on high-visibility projects that had close VP-level scrutiny, and generally have had opportunities to drive impact across experience, business, and technical domains at Microsoft.
Principal Product Planner - Planning & Research/eXperience Collective (2019 - 2021)
In January 2019 I took on a role as a driver of Rajesh Jha's Customer-Driven initiative. We were operationalizing one of the key Microsoft culture shifts that Satya Nadella had identified; namely, we were enabling everyone at Microsoft to engage directly with our customers. The Customer-Driven initiative targeted 17K+ employees in Rajesh's Experience and Devices (E+D) engineering organization. Our goal was to provide every single E+D employee with the motivations, the skills, and the opportunities to learn from and learn about our customers. By reducing the distance between the engineering teams and our customers, the intent was to have customer obsession/connection happen at scale and be ingrained throughout the design and development process of all our productivity experiences.
After a year and a half on the above initiative, I moved into a role leading research for other Microsoft 365 experiences. First I worked with multiple product teams to improve collaboration experiences for the main Win32 Office apps (Word, Excel, and PowerPoint). This included experiences such as sharing, sync and async file collaboration, @ mentions in document comments, file protection, and more.
In 2020, I shifted to a role focused on understanding very small business (VSB) customers and working with product teams to improve onboarding, engagement, and ongoing usage scenarios. We needed to reimagine Microsoft 365 experiences for VSBs and therefore needed a deep understanding of customer needs, pain points, and desires. I had a wonderful collaboration with my colleague Veronika, which also provided me with a strong belief in the power of intra-Research collaboration. Because two or more Researchers working together are so much more creative and impactful, I now have a slogan “No Researcher Alone!” And this was also a dream project because of marvelous partnerships with Program Management, Data Science, Marketing, and other disciplines.
Concurrently, I was also on a team that was creating what we called Inclusive Research. This was a Diversity, Equity, and Inclusion effort to ensure that in our research practices we included the voices of all, and not just heretofore dominant populations when you consider factors such as race, socioeconomic status, geography, gender identity, sexual orientation, and more. A fun and challenging project.
Principal Product Planner - Office Experience Organization (2017 - 2018)
In January, 2017 I moved into a Product Planning role, joining the Planning & Research (P&R) team that supports the Office Experience Organization (OXO). The P&R team works on projects to specify the vision and to create usable, useful, valuable, and productive experiences across Office. My focus was on making it more compelling and easier for large companies to adopt Office ProPlus (which has since been rebranded as Microsoft 365 Apps for enterprise).
My accountabilities included:
Defining engineering priorities related to ProPlus deployments.
Providing relevant insights to marketing.
Running the Planning & Research extension staff.
Principal Program Manager - Office 365 NPS (2015 - 2017)
As we evolved the Office business from a boxed perpetual software offering to a hybrid cloud+client subscription service, we also evolved how we tracked user sentiment. Like any modern service we had an urgent need to understand how to have our user base become our advocates and promoters, and how to derive actionable insights from customer sentiment.
Using quant and qual methods, we also studied consumers and businesses using competitive offerings, and we did quarterly read-outs to executives and multiple product teams across the Experience & Devices division.
My accountabilities included:
Research strategy and execution. How do you understand the sentiment of 1 billion-plus customers? What methods will provide the right, actionable insights to deeply understand the levers that drive satisfaction up? I needed to answer those questions and then work across multiple teams as we executed on that strategy.
Analyzing data and generating insights. To build user stories and engineering action plans, I analyzed qualitative (mostly verbatims) and quantitative data, in addition to working with data scientists, user researchers, market researchers, and others as we aggregated findings from multiple sources for a holistic view of our customer base.
Defining targets and building action plans to meet them. When working with multiple engineering and marketing stakeholders at multiple altitudes in the organization, something as simple-sounding as defining a sentiment KPI can be a daunting task (the arithmetic, at least, is the easy part; a minimal sketch follows this list). And once we agreed on a goal, I needed to help ensure that teams took customer feedback into account within their forward planning and backlog priorities.
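For illustration, here is a minimal sketch of the arithmetic behind an NPS-style sentiment KPI: the share of promoters minus the share of detractors on a 0-10 'likelihood to recommend' question. This is only the textbook calculation; the actual Office 365 survey design, thresholds, and weighting aren't described here, and the sample data is invented.

```python
from collections import Counter

def net_promoter_score(ratings):
    """NPS-style score from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors 0-6, passives 7-8; the score is
    % promoters minus % detractors, so it ranges from -100 to +100.
    """
    if not ratings:
        raise ValueError("no ratings provided")
    counts = Counter(ratings)
    promoters = counts[9] + counts[10]
    detractors = sum(counts[r] for r in range(0, 7))
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical sample of survey responses
print(net_promoter_score([10, 9, 7, 6, 8, 10, 3, 9]))  # -> 25.0
```

As the accountabilities above suggest, the hard part isn't the formula; it's agreeing on the goal and turning the verbatims behind the number into engineering action.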
Skills that I needed to be successful in this role:
Project management.
Data analysis.
Cross boundary collaboration.
Strong presentation and storytelling skills for a variety of audiences.
Patience.
The Way Back Machine
Prior to Microsoft, I worked in consulting and at startups, mostly in Silicon Valley. But almost all of my relevant professional development - through various types of engineering and marketing roles - has been at Microsoft. My professional journey at the company, which began in the early 1990s (!), is shown below.
Microsoft Usability (1993 - 1995)
Usability engineer (contractor)
People manager (managing my fellow contractors)
Ask me...
Was there anything good about Microsoft Bob?
Contractor to central usability group
Usability studies for products across the company
Learning and leveraging various lab and field methods
Tested prototypes and early design ideas
A set of us worked for an agency in CA, and I did the local people management in Redmond
Hired as full time employee in 1995
Supported Excel and Project teams
Learnings and accomplishments
I 'learned by doing' as a contractor and then as a new employee at Microsoft, taking advantage of the opportunity to work with a myriad of development teams on projects that were at different stages.
Microsoft Program Management (1996 - 1999)
Ask me...
What were some of the challenges in designing Excel for the browser?
What in the world is 'see-through selection' and why did we invest the effort to do this work?
My role on Excel 1997 and Excel 2000
Program Manager
Accountabilities
UI + release PM for Excel 97
On the web Excel team for Excel 2000
Initial designs for saving Office files to HTML
Helped design a Java version of Excel (which we abandoned)
Then worked on an Active X version (which we shipped)
In-grid interactions
“See through” selection
Excel calculator (designed, not shipped)
Function completion, in-context UI
Learnings and accomplishments
I learned to create and ship experiences that I designed and that I was proud of. I learned to PM at Microsoft, which (at the time) was all about having the tunnel vision to ship on plan and on time. It meant making lots of priority choices, dealing with checklist after checklist, living in the bug database, playing defense against requests and "helpful" ideas from all angles.
Microsoft Product Planning and Market Research (1999 - 2002)
Product Planner, Office
Senior Market Researcher, Server tools, Developer audience
Ask me...
How should product planning and market research relate to product design?
What is a Tasks/Methods study?
Product Planning for Office 2000
Helped start the product planning function for Office, working on a 3-person team tasked with forward-looking research
Drove a Tasks/Methods research project, which was a formative study for Office engineering to look at user goals and methods both independent of - but also correlated to - technology solutions
Market Research in the Central Marketing Group
Managed market research for DevDiv and Server Tools teams
Quantitative trackers and one-off studies
Mostly for product marketing teams
Learnings and accomplishments
As a product planner, I took on a quantitative study that had a multitude of stakeholders and that required crafting a story from the many angles of analysis the data supported. The fact that this work was referred to for many years was a testament to its quality and value.
As a market researcher, I learned about defining projects amid often-conflicting marketing stakeholder needs, and how to run quantitative studies that required managing vendor resources.
Microsoft User Experience Research (2002 - 2004)
Senior Usability Engineer, product research for Office 2003
Ask me...
What are some of the traps of scorecarding user experiences?
Summary
The 2003 release was my return to engineering from a few years of market research. I was 'embedded' with the Office Online team, defining customer segments and measuring customer satisfaction. I worked with the Word team helping to scope the collaboration that Word was doing with the Windows team around file save and file open experiences. And I worked across Microsoft on projects such as defining Customer/Partner Engagement (CPE) goals to be integrated into the employee review system.
Interestingly, my work with Office Online was a nascent 'big data' effort before we had such nomenclature. The Office Online site was responsible for integrating feedback and social experiences. For example, I was part of the team defining how we should enable customers to rate and provide feedback on Office Online content and experiences, and then I was a main consumer and evangelist of those data feeds.
This release was also the first encounter I had with UX scorecarding.
Learnings and accomplishments
For metric-driven execs and engineers, robust data feeds are sexy, and the ability to leverage them is an invaluable part of a researcher's toolkit. They play an important part in running a live and responsive service. But they're difficult to implement well, and it's easy to overpromise results from them.
Creating a scorecard is harder than it looks. And driving a scorecard can generate more enemies than friends. As with data feeds, they're useful and have their place in a research toolbox, but they're a ton of work to do properly.
Microsoft UX Research Manager (2004 - 2007)
For the Office 2007 release I took on managing user research for Office. Our research team had enormous impact on revamping the product, but we also made a historic usability gaffe under my watch. In other ways we made significant strides forward in modernizing our approaches to user testing.
Ask me...
What are the possibilities and the limitations of usage data for making product design decisions?
How does learnability relate to usability?
What are some factors to consider in doing remote, high-n user studies?
My role on Office 2007
Senior Research Manager (managing the User Experience Researcher team for Office)
A big bang release for product research
Moving to the Office ribbon was an idea born from user data, and its iterations depended heavily upon research. Research defined the problem, and research was continuously consulted to define the solution.
Years of studies on menu scanning behaviors and user perceptions of 'clutter,' plus learnings from previous Office designs such as the disastrous short menus, helped to demonstrate the problem and opportunity that ended up being instantiated as the Office ribbon UI.
"Where's the Print button?"
This is something we got wrong, big time. We knew to expect wailing and gnashing of teeth about the overhaul of the UI. What we didn't predict was that our clever morphing of the File menu into the Office jewel would be an unmitigated usability disaster, with a learnability barrier that generated tons of support calls and user frustration. Just as you're seeing Windows having to significantly roll back some of the design decisions made in Windows 8, the Office jewel had to be rolled back in Office 2010.
Getting back into managing people...and developing leaders
For the 2007 release, I switched from being an individual contributor to being a manager of a fairly large team. I had 25 researchers who were responsible for working across about 50 different Office dev teams.
I also joined the cross-Microsoft User Experience Leadership Team, and took on leading a program to identify the next generation of Design and Research leaders and to implement a development program for them.
Scaling with remote studies
As part of our research strategy during this release of Office, we wanted to have some high-n benchmarking, but we wanted to do it more efficiently than we'd done in the past. The problem was that we wanted to test experiences that used Office server functionality (e.g., Exchange and SharePoint) as well as native client bits, and we wanted to collect usage and perception data in very specific ways. These requirements meant we needed to roll our own solution, and since we weren't provided with any development resources, we really were on our own.
The project was a short term win, a long term loss, and provided valuable lessons. We made it happen and got the results we wanted for the studies that we designed, but our solution died on the vine after that, as we did not have a good plan for maintaining it. We did, however, learn a ton about bang for the buck and limitations of testing users remotely.
Learnings and accomplishments
The ribbon work in general was a huge accomplishment for our team of researchers. The prediction that users would quickly get over the learning hump of transforming the File menu into the Office jewel, however, was both a design and a research disaster. We were too myopic in sticking to design goals of integrating branding into the UI and in thinking that the general population would get over the learnability hump on their own.
The UX leadership development program that I drove was a huge success. We did several versions of it due to the demand from UXers across the company.
As for remote studies, the big lesson was to plan better and secure the right resources before taking on any development project.
Microsoft Principal Design Manager (2007 - 2010)
Ask me...
How do you drive impact from field research?
What's a 'dev kitchen'? Why is this an effective method?
How could you integrate Twitter feeds into a research plan (and why would you)?
My role on Office 2010
Principal Design Manager, Office Experience Group
The Office 2010 release was my first experience managing both Designers and Researchers, and our team took on some significant efforts driven across Office by both of those disciplines.
Beyond 9 to 5
In an earlier release of Office, our research and planning teams collaborated on what we called a '9 to 5' study, going deep on the workday habits, needs, and perceptions of information workers. As with the Tasks/Methods study I had done for Office 2000, our '9 to 5' study was constantly referred to. It had insights that were impactful across Office and across Microsoft.
In preparation for the 2010 release, we wanted to aim productivity at 'whole life' experiences of our customers. In other words, we wanted Office to be relevant in the workplace and in the home.
The Beyond 9 to 5 study, with a base of worldwide ethnography supplemented by other methods, was quite impactful for the 2010 release, and was another study referred to over and over again for years after its completion. (Where we missed the boat at the time was in not focusing on phones and general mobile needs and solutions, and on what was going to happen as a result of iPhones and iPads.)
Defining the product vision from Design+Research
The 2010 release was the first time that the team I was managing - the Office Design Group - drove the release envisioning process. From our own work, and by working across execs and senior engineering leadership, we released vision collateral that included design sketches and prototypes woven into an overall story for the release.
The cool thing about the vision that we helped to put together is that we did a retrospective look to compare what we shipped to our vision and found it to be quite prescient. In the world of a 3 year release where our aim was to assert what we needed to do and then execute on it, this was counted as a huge success. Of course, from today's vantage point, it's easy to say we were measuring success in the wrong way.
Developer Kitchens
A cool method that we built into our research process was holding 'kitchens' where users came to us with their work. In particular, well before we were ready to ship and while we could still make significant changes, we had developers from industry and ISVs come to campus to use our new stuff to create the Office solutions that they wanted. This method was an interesting mix of real user context in a lab-like environment, yielding efficiency in responding to real user problems. It is a great general way to impact an engineering team that is in the midst of coding.
Send-a-Smile and social media
The 2010 release was also where we went big with our Send a Smile functionality and started to leverage social media to learn about user perceptions of our software. With UI integrated into dogfood and Beta releases, we were able to capture more robust feedback from users. And by mining Twitter and other social feeds, we could more quickly see 'voice of the customer' reactions. While these methods had obvious biases, they were invaluable for pointing us to potential questions for further research or other investigation.
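To give a flavor of how mining feedback at scale can point to follow-up questions, here is a minimal, hypothetical sketch that tallies mentions of a few features across feedback verbatims; the topic list and verbatims are invented, and the real pipeline involved far more than keyword matching.

```python
import re
from collections import Counter

# Hypothetical verbatims, e.g. from Send a Smile submissions or social posts
verbatims = [
    "Love the new ribbon, but where did Print go?",
    "Can't find the print button anymore",
    "Co-authoring in Word is great",
    "Why did the File menu disappear?",
]

# Crude keyword tally: a spike in a topic is a prompt for deeper research,
# not an answer in itself (the biases mentioned above still apply).
topics = ["print", "ribbon", "file menu", "co-authoring"]
counts = Counter()
for text in verbatims:
    for topic in topics:
        if re.search(re.escape(topic), text, flags=re.IGNORECASE):
            counts[topic] += 1

print(counts.most_common())  # e.g. [('print', 2), ('ribbon', 1), ...]
```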
Learnings and Accomplishments
I was quite pleased by the step forward that our design and research team took with creative methods such as dev kitchens and our beyond 9 to 5 research, plus how we were able to bring together the release vision for the entire org.
The big lesson, though, is that we still didn't have our vision wide enough to see where the world was really going with smartphones and 'good enough' solutions from our key competitors. We did great work within the zone of information workers working on PCs, and staying in that zone for so long has come back to bite us.
Microsoft Principal Research Manager (2010 - 2013)
Ask me...
How can you rally a development org around user scenarios?
How did we push out to 'dogfood' and external Beta users ways to explore key scenarios that we were building?
What are strategies for scaling the work of a small research team across a large engineering team?
My role on Office 2013
Principal Research Manager, Research Envisioning Design (RED)
For Office 2013, we brought in a new Design Manager and I continued to manage the Design Research team of some 30+ souls. In this release, our researchers continued to go big with large cross-division efforts.
Office Big Bets
A process that I was in the midst of throughout the entire release was our decision to pivot the release on "Big Bets." It was an interesting and rocky journey.
One challenge was wrangling a multitude of execs and almost 70 different development teams across Office to try to come up with a single shared set of priorities. And I was on point as a main wrangler throughout the release.
Frankly, we weren't nearly as successful as I had hoped. The worst news is that we started with too many Bets, and it took us months to decide to sharpen our focus. While that wasn't totally on me (it's hard to get a bunch of VPs to all give up their pet projects), I probably could have pushed for earlier cutting and priority aggregation. On the other hand, the Bets did map to real customer value, and we were able to keep pressure on teams to stay attuned to customers via the Bets.
A big win throughout the release was to be able to leverage the Bets framework to bring multiple teams together around key scenarios.
Partnering with Test
Partnering with Test to scale research work and to influence the scenarios that Test focused on was something we'd done in prior releases, but we really pushed harder for this in the 2013 release. It's one example of how to integrate research into the flow of software engineering. The kernel of the idea was to ensure that Test was also looking at what we were building through the lens of the Big Bet customer-focused scenarios.
Learnings and Accomplishments
In championing the Big Bets, I learned the power of working with and through executives to try to change how we prioritized work and focused on experience outcomes. Using scorecarding as a communication method was a mixed blessing. (If you've ever created a scorecard to be consumed by others, you'll know what I mean.) But having user experience reviews that had teeth due to upper management participation worked well to get significant changes made, even late in the process.
Microsoft Design and Research Manager (2013 - 2015)
Ask me...
What's holistic 'qualitative-to-quantitative' research?
What's a Call for Interest?
What's an Experience Review?
How do A/B testing and other 'big data' methods fit with other product research methods?
What are the struggles and opportunities in moving a research team from long release cycles to an agile development cycle?
What's a dream team for formative product planning?
My roles on Office 'Gemini'
Principal Design Manager
Research Manager, UX analytics team and Market Research team
The Gemini release is happening while Microsoft is undergoing tectonic change and is under tremendous market pressure built up by the decline of the PC era and the rise of the mobile and cloud era. Across the company, the Design disciplines are generally on the rise. And in the context of the growth of Data Science, across Microsoft we are redefining Design Research at a time when we are also redefining how we do engineering in general.
Data analytics
For Office client software and services, we created a data and analytics team as part of the strategy of doing end-to-end, qualitative-to-quantitative research. In addition to integrating new research skills, capabilities, and methods, creating our own data analytics team is also about doing the type of research needed to become an 'experiment and learn' organization.
With a small 4-person analytics team, we do some work to scale across the org and some work to go deep into particular problems. For scaling, we're the team that democratized data by defining and publishing common data reports. And we have been working to help create the right data pipeline by defining key usage and business metrics to be tracked by telemetry data.
Our data analytics team works across PC client, phone, and other mobile platforms, and does some standalone analyses in addition to work that bolsters other types of research. Some examples:
Ribbon heatmaps, providing a way of visually tracking usage data by overlaying the data on Office UI.
A/B experimentation, teaching our Design Researchers and other engineers what A/B testing is good for, how it can fit with other types of research, and how to design useful experiments (a minimal sketch follows this list).
Answering questions such as the impact of form factor or OS on Office usage patterns, patterns of Office session length, and usage patterns around particular tasks such as saving, sharing, and collaborating on Office documents.
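As an illustration of the basic statistics behind reading out an A/B experiment on a binary usage metric, here is a minimal sketch of a two-proportion z-test; the metric, sample sizes, and numbers are hypothetical, and the real experimentation platform and analysis involved much more than this.

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    # Pooled rate under the null hypothesis of no difference between A and B
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical experiment: does variant B lift completion of a sharing task?
p_a, p_b, z, p = two_proportion_z_test(4_200, 50_000, 4_550, 50_000)
print(f"A={p_a:.2%}  B={p_b:.2%}  z={z:.2f}  p={p:.4f}")
```

The harder parts, as noted above, are choosing a metric worth moving and fitting the result alongside other types of research.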
Designing Office 365 'onramp' experiences
Working with multiple engineering and marketing teams, we are re-working the experiences for customers to choose, try, buy, install, and manage Office 365 SKUs. Many of these designs are in market. I hired and managed a combined design and research team for this work.
Release planning
I wasn't a main driver of the planning process, but learned a lot from how we did it. A key was having true multidisciplinary teams working on focused areas. We had developers, testers, program managers, designers, marketers, and researchers all working together. And just within research we had ethnographers, product researchers, envisioning researchers, and MSR researchers all contributing. Microsoft has a history of product visioning being driven by PM, and this was a refreshing reminder of how diversity helps create better outcomes in many ways.
Reimagining research for Office/Microsoft
Along with a peer Design Research Manager, I've been driving a 'One Research' effort aimed at combining market research, HCI/product research, ethnographic/field work, and data analytics. We are pushing these different research specialties to combine methods within more holistic research projects.
For example, our market researchers are now combining their own ethnographic methods as well as telemetry with the quantitative survey projects that used to be done in isolation from those methods. Our product researchers are thinking about business goals, and our telemetry experts are injecting their expertise into a myriad of qualitative and quantitative research projects.
Learnings and Accomplishments
I'm proudest of how we are shifting our researchers away from incremental insights, driving them to step back from the iterative design insights they focused on in the past and to develop POVs about the larger business and experience goals that we should be aiming for.
As noted above, we've added data analytics to our arsenal of research capabilities, having it work with both market research and product research. With these combinations, we can validate other research methods with telemetry, while also using those other research methods to follow up on separate telemetry findings. At a high level, we can use telemetry to uncover the 'What?' while using other methods to get at the 'Why?'