The power of feedback and metrics at LinkedIn with Rajon Tumbokon

Estimated reading time: 9 minutes

Keeping an eye on customer pain points, and finding solutions that address them, is the best way to ensure a product stays relevant and useful. Here Rajon Tumbokon, who leads engineering onboarding, training and development for engineers at LinkedIn, talks about what the team learned from feedback and metrics that led to redesigning their new-hire ‘bootcamp’. What follows is how they have embraced a new, more hands-on learning curriculum and a more iterative training design process.

Engineering Bootcamp

Rajon Tumbokon manages all the technical Learning & Development programs at LinkedIn that are tailored to their engineers, and the flagship course is Engineering Bootcamp.

The goal of the program is for new hires to learn how to develop at scale. It’s a very different process when working with 3,000 other engineers compared to working alone with just some basic systems in place, and the intention is to get people using the same tools, simplifying the process, and enabling career mobility.

“The challenge is how to keep engineers shipping code multiple times a day, without breaking everything,” says Tumbokon. “How do we shift the mindset or behavior of a person who is used to just developing by themselves, or has a thought process already institutionalized from another large technology company — so that they are capable of developing within the broad infrastructure that we’ve built?”

The first Engineering Bootcamp was launched in 2015. At the time, the biggest need was for what LinkedIn calls the applications track, the back-end of software development. The company was growing, but the curriculum was staying the same. There was a need, first, to update the technology being taught and, second, to make the program relevant to different types of tracks.

Looking at The Data

The first indication that the bootcamp wasn’t delivering as expected came from informal feedback.

“HR started coming to us with feedback from exit interviews that certain types of engineers weren’t happy with the training they got,” explains Tumbokon. “Once we started hearing those anecdotes, ‘water-cooler stories’, we wanted to validate that through data. So we did qualitative focus groups of engineers joining the company, as well as looking at all the survey feedback from the end of the program, and again 30 days into their jobs.”

To do this, the team looked at the Net Promoter Score (NPS). Usually used to evaluate customer loyalty, NPS can be controversial when captured internally; but this is where Tumbokon found it most useful.

The NPS splits respondents into detractors (a score of 6 or below out of 10), passives (7–8), and promoters (9–10). Tumbokon’s team looked only at the detractors, which filtered the data and saved a lot of time. They looked at all the open-ended feedback, segmenting the first week and the first 30 days, and this allowed them to see why those detractors weren’t getting value out of the bootcamp.
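The arithmetic behind that breakdown is straightforward. The sketch below is a minimal illustration in Python; the survey fields, stages, and track names are hypothetical stand-ins, not LinkedIn’s actual survey tooling. It computes a Net Promoter Score and pulls out only the detractors’ open-ended comments, segmented by survey stage, which is essentially the filtering the team describes.

```python
# Minimal sketch of the NPS breakdown described above.
# Field names and stage labels are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    score: int     # 0-10 "would you recommend this bootcamp?" rating
    stage: str     # e.g. "week_1" or "day_30"
    track: str     # e.g. "applications", "sre", "frontend"
    comment: str   # open-ended feedback

def nps(responses):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not responses:
        return 0.0
    promoters = sum(1 for r in responses if r.score >= 9)
    detractors = sum(1 for r in responses if r.score <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

def detractor_comments_by_stage(responses):
    """Only the detractors' open-ended feedback, grouped by survey stage."""
    segmented = {}
    for r in responses:
        if r.score <= 6:
            segmented.setdefault(r.stage, []).append((r.track, r.comment))
    return segmented
```

Reading the detractors’ comments stage by stage is what surfaces patterns such as whole tracks finding the content irrelevant.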

The data revealed that the detractors were all coming from specific tracks, for which the content was not particularly relevant. Although the rest of the engineering organization did find some value in the applications program, those teams were beginning to consider creating their own customized programs. The training team noticed this gap and wanted to be relevant to all software engineers.

Customized, Hands-On Learning

To find a solution, Tumbokon and his team, which also included the course designer and two instructors, analyzed the data, talked directly to all the new engineers and managers, and started brainstorming. They created a large committee of ten engineers with specific domain expertise and a passion for improving training. It became clear that there were significant content gaps, and the committee was helpful both in proposing solutions and in contributing content for the workshops.

The result was an updated course program that worked for all tracks. The bootcamp still runs over four days, but while the mornings consist of general content, the afternoons now involve mentor-led workshops based on specific tracks.

“One thing I really believe in is the 70–20–10: people learn best if they have 10% formal learning, 20% with a mentor, and 70% hands-on training. We wanted to create a microcosm of that within the bootcamp, and so most of the learning is through doing.”

“The goal is now not only to learn how we do things at LinkedIn but also to develop your first app on the LinkedIn technology stack; so by the end of the program, you will have already developed something.”

The Challenge

Tumbokon, a non-engineer, helps define the learning goals and makes sure that all the content and workshop activities stay within the scope of those goals.

“The hardest thing was to define the scope of what should be included in Engineering Bootcamp; any subject matter expert is full of knowledge and will want to cram everything in, all at once. The program needed to be relevant horizontally for everyone, and it needed to be relevant for them within their first 30 days — those were the constraints and parameters that we put around ourselves.”

There was also some resistance around the new ‘learning-by-doing’ approach.

“For some reason, engineers expect training and education to come in the form of a lecture. ‘Show me the content! And I’ll absorb it.’”

“Some of the concerns or hesitation we’ve seen have also been about frustration. You shouldn’t be scared about your learners being frustrated, because that is the real world; it’s just part of the learning process.”

Measuring Success

The results of the first-week survey, which aligns with Level 1 of the Kirkpatrick Model (the worldwide standard for evaluating the effectiveness of training), have been above and beyond the expectations of Tumbokon and his team. The NPS has gone from 50 up to 80, and the SREs (site reliability engineers) who have gone through the new bootcamp are now 100% promoters.

For now, Tumbokon and his team can only measure the effectiveness of the new training format at the first-week stage. Because the revamped program is still new, they are waiting for a critical mass of responses before analyzing the 30-day survey results (Kirkpatrick Level 3; Level 2 is seen to correspond to the demo day, where engineers demonstrate what they have learned and teach it back to the class). It is also still unclear how to measure success at Level 4 for the new tracks, some of which will require different metrics than the applications track.

In terms of the applications track of the initial iteration of the program, however, there are already clear improvements. Before the bootcamp was introduced in 2015, the average new application developer wasn’t committing their first code until beyond their first month: they couldn’t pass peer review, and they didn’t understand how the build and integration systems worked. With the introduction of the bootcamp program, developers are shipping within two weeks or less.

Applying The New Model

Now, thanks to its success, the new course design is fast becoming a model for other programs being created post-bootcamp. A ‘flipped classroom’ model is being implemented for all large programs currently in development.

“Instead of wasting an engineer’s time in class with a lecture, they can watch a video as required pre-work and then the class is exactly what we do in bootcamp: Check out the source code, try to build it — it’s going to fail, because that’s the way that we designed it — and then they can learn, and go from there.”

The process of developing the training content has also evolved. The team’s process is influenced heavily by the rapid prototyping model. Instead of developing the whole course at once, the team now prototypes each activity, one at a time, gaining insights along the way.

“For example, we designed one activity — we created the code base for it, and we prototyped it with two or three engineers from different experience levels; and we found out that it was way too difficult! It also surfaced content gaps that weren’t in the video pre-work. So, we learned that we could lower the level of the challenge a little, and add some more instructional content to fill the gaps to better prepare the engineers before the class. This is an approach that I highly recommend.”

