Episode 01

Headshot of Lara Hogan

with Lara Hogan of Etsy

Etsy has long been a model for cultivating a culture of performance within an organization. In our first episode, we talk to Lara Hogan to find out what Etsy does to ensure performance remains front and center.

Direct Download

Sponsors:

Build Right—The Build Right Maker Series brings relevant voices to the midwest, where you get the chance to intimately talk to industry leaders in a small workshop setting. The Maker Series features down-to-earth conversations about real-life experiences along with an anything-goes interaction in the afternoon. Tickets for the 2015 series are available now!

Transcript:

Katie Kovalcin:

Welcome! You’re listening to episode one of the Path to Performance, a podcast for everyone dedicated to making websites faster. I’m your host, Katie Kovalcin.

Tim Kadlec:

And I am Tim Kadlec. Yeah, so this is episode one and today we’re going to be talking to Lara Hogan of Etsy. But since it is the first episode, we should probably explain to people what they’re getting themselves into. So Katie, why are we doing this?

Katie:

Well Tim, we’ve had several Twitter discussions and emails talking about this very subject. You talk a lot about performance, I kind of talk a lot about performance and design, and we both agree that a lot of what performance comes down to is an organization’s culture. So we thought it would be really cool to talk to companies that have performance as a really important part of their culture and see how they did it, and get some really great case studies, data, and examples that everybody can learn from, with the awesome resources and guests that we’ll have.

Tim:

Absolutely, and we’re aiming to do this twice a month. On the subject of the case studies—if you have a great performance case study, maybe it’s a client project you’ve worked on, or you’ve successfully set up a culture of performance in your organization, we would love to hear about that. If you email us at hello@pathtoperf.com, we’d love to talk to you and see whether it would be a great fit for the show.

Katie:

We also want to hear from you if you would like to sponsor an episode, or several. You can also email us at hello@pathtoperf.com. Speaking of sponsors, today’s episode is brought to you by Build Right.

We’re really excited to have our first episode sponsored by Build Right. Their workshops are designed to inspire and empower a better, stronger web. Build Right offers a whole slew of workshops and in-house training to empower everyone to build right. The Build Right Maker Series brings relevant voices to the midwest, where you get the chance to intimately talk to industry leaders in a small workshop setting. The Maker Series features down-to-earth conversations about real-life experiences along with an anything-goes interaction in the afternoon. Attendees have an opportunity to spend the day learning from an industry leader and enjoy a lunch and happy hour together.

Tim, you’re actually giving a performance workshop at the Maker Series this April, and a bunch of other great speakers are on the lineup, like Dan Mall, Pamela Pavliscak, and our friends Ethan Marcotte and Karen McGrane of the RWD podcast. There’s limited seating and a few early bird tickets for each workshop, so hurry up and go get your tickets today! All the details are at Buildright.io.

Tim:

Yeah, super happy to have them sponsoring, and I am excited to do that, it’ll be fun. So before we talk to Lara, we want to talk about a couple of things that came up recently in performance—we should come up with some funny music for this or something like that, something that makes it seem epic.

Katie:

[Laughs] So, a big thing: Tim, you recently released What Does My Site Cost? into the wild. Can you talk a little bit about that?

Tim:

Sure, yeah. It’s Whatdoesmysitecost.com because I’m not very creative with domains. We’ve talked a lot about how performance is shifting to more experience-based metrics like speed index, or things like Twitter’s time to tweet, which is great. But the weight still matters, just in a little bit different of a way than I think we’re used to looking at it.

After a couple of conversations with other people about this, I realized that there were several of us that were really curious about how much it actually costs, from a monetary perspective, when somebody loads these fat sites over a mobile network. So, that’s what Whatdoesmysitecost.com does; it uses WebPageTest and looks at the weight of the site and figures out how much approximately that costs around the world based on data from the ITU.
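For reference, the underlying arithmetic is straightforward: take the page weight from a WebPageTest run and multiply it by a per-megabyte mobile data price for each country. Here is a minimal sketch of that idea in TypeScript; the prices below are made-up placeholders, not the ITU figures the real tool uses.

```typescript
// Sketch of the cost calculation behind a tool like What Does My Site Cost?:
// page weight multiplied by a per-megabyte mobile data price.
// These prices are hypothetical placeholders, not real ITU data.
const pricePerMbUsd: Record<string, number> = {
  Germany: 0.06,
  Brazil: 0.03,
  "United States": 0.04,
};

function costToLoadUsd(pageWeightBytes: number, country: string): number {
  const megabytes = pageWeightBytes / (1024 * 1024);
  return megabytes * (pricePerMbUsd[country] ?? 0);
}

// A 1.4 MB page viewed from Germany under these placeholder prices:
console.log(costToLoadUsd(1.4 * 1024 * 1024, "Germany").toFixed(2)); // "0.08"
```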

Katie:

It’s a super awesome resource, especially for sharing that stuff with clients who may not understand that impact. I did it on my site and I think the most that it cost was 8 cents to people in Germany.

Tim:

Nice! Yeah, Germany is expensive. But I like the GNI comparison that shows it as a percentage of the daily gross national income. That one is the eye-opener for me.

Katie:

Have you seen a lot of traffic being used for it? Is that something people have started to work into their workflows?

Tim:

Actually, that’s what’s been fun. It went nuts when it first got launched, partly because it’s also integrated into WebPageTest. When you run a test through WebPageTest, you see little dollar signs as part of your test results that relate to how expensive it is, and then that links off. It’s been fun though to watch people passing around the results and kind of razzing each other over who is more expensive and whether or not so-and-so’s blog post is actually worth it. [Laughs]

Katie:

Another thing that has been circulating around is HTTP/2 and how that’s going to impact all of the performance things that we’ve done in the past and how that’s going to change.

Tim:

That’s a huge topic and there’s a lot we could talk about, and we probably will come back to it many times. We may even break our rule and get a little bit more technical once we’re talking about it. We’ll see. But yeah, we’re going to have to relearn a lot of the things that we’ve believed were performance best practices for a long time now, because it completely shifts the way the whole network is going to work. It’s a big deal. Recently Matt Wilcox wrote a general introduction—I would say it’s appropriate not just for developers but for anybody who’s curious about what HTTP/2 means for them.

Katie:

Another cool thing, and it goes back to why we started this podcast, is that it doesn’t really matter what technical things we have to do for HTTP/2: if we already have a culture that appreciates and fosters performance, changing up how we build sites should hopefully only be easier if the company already values that. That’s how I look at it.

Tim:

I agree, that’s going to be a big thing. It’s important to note that it’s not something that we have to worry about only in the future—it’s already here in many places. Recently Akamai talked about how they were rolling it out for a lot of their customers by default; it’s not even something they have to pay extra for or opt into, it’s just going to be there. There are already a few sites they link to in their announcement post where you can see how it’s changing the behavior and things like that. It’s something that’s here and something that is a big shift, so we’re definitely going to need to deal with it.

Katie:

Do we want to touch a little bit on performance audits? We’ve seen some cool stuff on that recently too.

Tim:

We have. There’s a new site, Perfaudit.com. Have you looked at that yet?

Katie:

I have not. It’s on my reading list.

Tim:

It’s nice. I think at the moment there are only two or three short, bite-sized audits of different websites on there. It’s nice to be able to walk through the tools they used to do the audit and some of the performance issues they came across.

Another recent one that’s even more detailed: Paul Irish posted a super long and detailed performance audit of Time, Google Play, and CNET, which is interesting not only for seeing an audit at that level of detail but also for seeing how Paul walked through identifying the issues and drilling down into them. Really, really interesting stuff.

Katie:

I’m going to definitely check that out because that sounds like there’s really good information there. Is there anything else that we wanted to talk about that’s happened recently?

Tim:

There’s more stuff we could talk about but I think I would love to jump into our discussion with Lara. We would like to welcome her to the show. Lara, can you tell us a little bit more about who you are and where you work?

Lara Hogan:

Hey guys! Thank you so much for having me on the show, I could not be more excited. I do work at Etsy. I’m the senior engineering manager for performance, which means that my team helps all of the product design teams make all of their stuff as fast as possible.

Tim:

That was very succinct, I like it.

Lara:

[Laughs] That’s my elevator pitch. It’s funny, when I talk to my parents about it I say that I help make the website faster, or that I help other people make the website faster and that usually helps.

Tim:

Do you still get asked though to occasionally fix the router or uninstall various plugins?

Lara:

Absolutely. You know, I find that it’s easier for them to tell other people what I do now that I work at Etsy. I used to work at a DNS company and it was a little more difficult for them to explain to their friends.

Tim:

Ooh, yeah, that’s nuts and bolts. That’s tough.

Lara:

[Laughs]

Tim:

So, how did you get from the DNS company to Etsy?

Lara:

It’s actually an interesting story. I cared very deeply about performance and I was doing a lot of performance work at that company. I was blogging about it, I was tweeting about it, I was giving a bunch of presentations about it, and a guy here, who’s now the VP of engineering at Etsy, found some of my work. They reached out to me and asked if I wanted to come do some cool leadership stuff and be a manager, which I actually really enjoy. Since then it’s been a really perfect partnership.

I came on initially to do mobile web for them, so they had a team of three or four developers and we worked on making the mobile website until we realized that that’s a responsibility that everybody should own, not just a select few engineers. Once we disbanded that team and helped everybody do that better, I became the senior engineering manager for performance.

Katie:

So bringing you on was largely a catalyst for starting this really strong performance culture that you have?

Lara:

Actually I wish that I could take credit for it—I would love to take credit for it—but it started long before I got here. [Laughs] It’s funny, when I talk about Etsy I always have to acknowledge the significant amount of privilege I have working here, where the CEO used to be a CTO, so he’s really performance-focused anyway.

This guy, Seth Walker, he initially started publishing quarterly performance reports, meaning on the Etsy blog he would say “Here’s how fast our pages are loading.” It’s a way to kind of give back to the community and help them understand that we were focused on this and trying to make it better. He really kickstarted our performance culture here at Etsy. That grew pretty significantly because what he noticed was once you start publishing about it, if engineers are reading it they’ll start to care about it too because there’s a little bit of pride that happens, and so it grew from there. He started the team which I then took over when I moved off of mobile web.

Tim:

Was that a rogue effort on Seth’s part? How did he get to the point where the company was like “Yeah, you can tell everybody how bad we’re doing in certain areas of performance,” like in the open?

Lara:

I wasn’t here for this, but the hyperbolic story goes that there was another engineer, who’s like a manager, a director, or somebody, who wanted to just publish numbers and it was really boring and not that helpful to anybody. It was just “Here’s some graphs!” or whatever. Seth took it on and said “No, we can do something with this. We can empower people to impact performance internally and we can help make this a community effort where we can show our sellers that we’re thinking about this.” So really, it’s all Seth’s brainchild, making it a real awesome thing.

Katie:

How does having a leader that really cares about this directly impact the entire organization? You said you’re privileged to work somewhere where a CEO really believes in it. How do you think other people can try to make their leaders care about it and have it be a really big part of what they do?

Lara:

I don’t try to lean on the CEO’s words as a crutch. If I wanted to, I could use it as a “Well Chad says performance is really important…” but I don’t do that because I think it’s important to remember that we all should care about this, whether or not a leader says it’s important—although it’s very helpful to have their buy-in for when you need to dedicate resources.

When it comes to not having that leadership support from the top down, there are a couple of things that I have found to work in my history of working in performance. One of those is to touch on that pride element that I find really, really works with very important people. They care about how we look versus how our competitors look. If you show a WebPageTest video of your site versus your competitor’s site and your competitor’s site is faster, oh man, that very important person is going to feel that and they’re going to hate that, and they’re going to want us to make it faster. I found that WebPageTest videos work for a couple of different reasons. Filmstrips are fine but waterfalls are really boring, and videos just make you feel how fast that site is loading and that empathy is really helpful.

You can also compare before and after videos of performance improvements. You can speed the site up and you can show where your work went, where those human resources went. You could show how your site loads on desktop versus how it loads on a shaped 3G connection, which is really painful. You could show how it loads close to your data center and far away from your data center to show the latency impact and how users around the world are experiencing it. Videos are definitely my number one thing to mention.

My number two thing to mention is you have to figure out what those very important people care about. Everybody thinks it’s just money, but that’s not actually the whole thing. As engineers we’re just like “Oh sure, everybody cares about conversion rate” or whatever that is, but actually it can get a lot more nuanced when you’re talking to very important people. Maybe it’s favoriting. Maybe it’s return rate. Maybe it’s churn. It can really depend on the person. So, if you’re able to tie performance metrics to those things, that will be a huge win for those people.

You can find comparable studies that have been published online. Google Maps had a great one about page weight and how decreasing page weight increased return visitors. There’s a whole bunch of stuff out there that people can rely on once they know what those metrics are that the very important people care about.

Katie:

Those videos that you just mentioned sound really awesome. In the past when I’ve been on projects where we tried to sell performance, we’ve just used page weight metrics with competitors. But these videos sound way more compelling. Can you talk about how you actually make them? It sounds like a really cool thing that a lot of people should be doing.

Lara:

When you talk about numbers, I find that numbers work really well with engineers. I can tell them that “you should aim for a speed index of under 1,000” and they’ll aim for it and they’ll care about that, but that doesn’t always work with designers and very important people that are not engineers and that aren’t technical. With the videos, I usually just go to WebPageTest and I choose a test location that’s far away from my data center and I choose a shaped 3G connection or some other kind of connection that’s slow, and you can just tick a box that says “Capture this as video.” It will export for you a video that you can download.

One of the fun things that I’ve done with this recently is we have a dashboard monitor that’s up on a wall—I should tweet out a picture of this—and we have comparison videos running side by side. I just downloaded those MPEG4s and I put them up next to each other on a blue background and I just said “One’s on cable and one’s on a 3G connection,” and you can just watch them, it refreshes and then you watch them again, and one is like three times as long as the other. It was really impactful because in the hallway you’re just walking by this monitor and it catches your eye and you realize “Oh my word, this is so slow!”
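For readers who want to automate this, WebPageTest also exposes the same options through its HTTP API. Below is a rough sketch of requesting a run with video capture enabled; the location string, connectivity profile, and API key are placeholders, and exact response fields can vary between WebPageTest versions.

```typescript
// Sketch: kick off a WebPageTest run with video capture via the runtest.php API.
// The location/connectivity string and API key are placeholders.
async function requestVideoRun(targetUrl: string): Promise<string> {
  const params = new URLSearchParams({
    url: targetUrl,
    location: "Sydney:Chrome.3G", // hypothetical location:browser.connectivity
    video: "1",                   // ask WebPageTest to capture a video/filmstrip
    f: "json",                    // return JSON instead of redirecting
    k: "YOUR_API_KEY",            // placeholder API key
  });
  const res = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
  const body = await res.json();
  // The response includes URLs to poll for results; once the test finishes,
  // the video can be downloaded from the results page. Field names may differ
  // slightly across WebPageTest versions.
  return body?.data?.userUrl ?? "";
}

requestVideoRun("https://www.example.com").then((resultUrl) => console.log(resultUrl));
```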

Katie:

That sounds really awesome. I definitely want to get one of those monitors set up at our office.

Lara:

It’s cool because you want to think about your users; performance, as we all know, is a huge part of the user experience. I have four videos, one from each of our major markets globally, and obviously the United States one loads really fast but then you get to Australia or Singapore and it’s just so slow. It’s really helpful to put the viewer in your user’s shoes so that they can realize “Okay, if I’m in Germany, this is how Etsy feels for me.”

Tim:

What I like about having the dashboards all over the place is that performance is one of those things that often gets overlooked because it can be invisible if we’re not looking for it, since we’re coming at it with super high-speed connections and whatnot. But if you’ve got those monitors all over the place proudly displaying, warts and all, the current performance of the site, then it becomes something that you can’t ignore; it’s there all the time.

Lara:

We use it for a couple of different reasons. Those videos are just part of it, because obviously numbers and graphs are really helpful, so we display some of those; it’s on a rotating dashboard. But my favorite part of it is this dashboard called Performance Hero. We do it quarterly, where we find a new Performance Hero, which is a person inside the company who impacted performance positively and who is not on the performance team; it’s some other engineer or designer or somebody else who found a huge win on the site. Then we put up a picture of the team, and we’re as enthusiastic as possible in the picture, like “Yeah, performance!!!”, just to communicate exactly how excited we are about this win with the person who won it, and we have a little cross-stitch performance logo that we hold up. We surround that picture with graphs of the performance win and a little description of what that person did.

The one that I just tweeted out this past week was Dan Miller: he implemented HHVM, which is an awesome Facebook thing, and we sped up our API by a whole bunch and it got us closer to our SLAs for our API calls. We wanted to celebrate that because he’s not on the performance team, and we just want to make sure that it’s clear to everybody that “we notice when you’re doing good work and we really want to thank you and celebrate it.”

Tim:

So, this is the whole carrot versus the stick thing, right? I know that you’ve talked about the idea of performance cops and janitors before.

Lara:

Yeah, I feel very strongly about this. My whole career here has been built off of taking the high road. I don’t think that punching down is ever the thing. I think it’s so important, when you’re trying to change a culture to care about something, to just work on the positive reinforcement. And there’s a ton of brain science behind this stuff; there’s a bunch of books about change management that support it. When you’re able to celebrate these things and, like you said, use the carrot instead of the stick, that’s the stuff that makes people actually want to do things more, and that’s the stuff that has so much more longevity to it.

Katie:

On the topic of these people outside of development that get celebrated for their performance wins, I’m really curious as to what your design process is like and the steps that your designers take to make sure that they’re keeping on track with all of these ideals.

Lara:

Design is something that I’ve focused a lot on in the past year. Especially when writing the book Designing for Performance, I wanted to make performance as accessible as possible to people who frankly don’t need to know what TLS is, they don’t really need to know the ins and outs of how browsers render stuff. They should know some of it, but I was hoping to explain it in a way that is less intimidating because I find that this stuff can be intimidating—even to me, who’s an engineer.

From a designer point of view, at Etsy we’ve got a number of product teams made up of engineers and designers and usually a product manager, and those people are tasked with a specific area of the site. Maybe it’s seller tools, maybe it’s the buyer experience, maybe it’s the checkout team, and those people all go about their day working on making that experience better. They’ll do that using A/B tests and prototype groups, in which there are some opt-in sellers or buyers who are helping us test it and giving us qualitative feedback. In general, they’ve got their own road map and they’re plugging away on it.

On the other side of the house, we’ve got these infrastructure teams, like mine, that want those product teams to do things well, we want them to implement best practices. You’ve got front-end infrastructure who wants to make sure our Sass stuff doesn’t get too bloated, you’ve got our data engineering team who wants to make sure that we’re collecting metrics for these experiments in a way that’s statistically significant, you’ve got an internationalization team that wants to make sure that every new design is global—we’re not just designing for a domestic United States audience.

So, it’s been funny to think about performance as everybody is trying to do things and trying to help people do things better. Imagine being a poor product designer: not only do you have to do your job, but you also have to do well by all these other 11 teams. Part of my job when it comes to working with designers is to continually do education. I think that performance is not a one-and-done thing. There are constantly new things being learned and, frankly, we all forget this stuff. I don’t need my designers to remember why PNGs are so much better at compressing things than GIFs. I just need them to remember “Think about image formats and maybe look up my old presentation.” Then I’ll give my Designing for Performance presentation once a year to help them remember that this stuff is important.

Throughout the design process, they might be measuring performance at the beginning or at the end. It really depends on the team’s workflow. Everybody can do things as they go, or after they start an experiment they can start to clean stuff up. So we’re really inserting ourselves as part of the process the entire way. Maybe at the beginning we can give them some tips on how to make something CSS3 instead of an image, maybe in the middle we take a look at it and find some opportunities to save some requests, or maybe at the end it’s an experiment and I notice that it’s a lot heavier, and I’ll come in and suggest some ways to improve it.

Tim:

You said that you’re inserting yourselves into the process. What does that look like exactly? Is it through the experiments? Are there check-in performance meetings? How does that work?

Lara:

That’s a good question. It really depends on the team. Some of the teams are really performance-focused and they have my heart and soul. Some of the other teams forget. I don’t blame them at all—they’ve got so many other things to think about, it makes sense that they’re going to forget about performance. My job is to make sure that we have monitoring and alerting in place, that way if something goes out then I know about it.

Let’s say someone pushes a change and the back-end time doubles. My whole team is going to get an alert that says what changed and where, it’s going to include some graphs so that we can see when it changed and if there were other things happening at the same time, did an experiment ramp up, etc. We get some context about it. Then we’re able to reach out to the team and say “Hey buddies, we noticed that this happened. What’s uh… What’s going on over there?” and we do it in a way where we offer help but make it clear that it’s on them to fix and that we’re just here to help point them in the right direction. We feel very, very strongly that the burden should not be on the performance engineer’s shoulders because that just leads to burnout.
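As an illustration of the shape of that kind of alerting (not Etsy’s actual tooling), here is a small TypeScript sketch: compare a recent value against a rolling baseline and notify the owning team with context. The metric-fetching and notification functions are hypothetical stand-ins.

```typescript
// Illustrative sketch of a "back-end time doubled" check. Not Etsy's code:
// fetchP95BackendMs and notifyTeam are hypothetical stand-ins for a real
// metrics store (Graphite, StatsD, etc.) and a chat/email integration.
type TimeWindow = { lastDays?: number; lastHours?: number };

async function fetchP95BackendMs(page: string, w: TimeWindow): Promise<number> {
  // Canned numbers for illustration only.
  return w.lastDays ? 400 : 900;
}

async function notifyTeam(page: string, alert: { message: string; graphs: string[] }): Promise<void> {
  console.log(`[ALERT] ${page}: ${alert.message}`, alert.graphs);
}

async function checkBackendTime(page: string): Promise<void> {
  const baseline = await fetchP95BackendMs(page, { lastDays: 7 });
  const current = await fetchP95BackendMs(page, { lastHours: 1 });
  // Alert when the recent value is at least double the weekly baseline,
  // and include graphs so the owning team gets context, not just a number.
  if (current > baseline * 2) {
    await notifyTeam(page, {
      message: `p95 back-end time jumped from ${baseline}ms to ${current}ms`,
      graphs: [`https://graphs.example.com/${page}/backend`], // placeholder URL
    });
  }
}

checkBackendTime("listing-page");
```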

Katie:

I’m a designer and, like you said, those more technical terms, the ones where you’re like “They don’t need to know this”—I don’t know what those terms are. But yeah, I’m trying to figure out ways that I can have those frequent check-ins with the people building it to offer suggestions on the design, and I think it’s hugely important to have that more technical input and advice as the design is going, rather than trying to scramble and retrofit performance into a design that’s never going to be fast.

Lara:

Totally. One of the other things that we do is the team built an add-on to what we call the Admin Toolbar. This is a little toolbar that sits at the top of every Etsy.com page when you’re logged in as a coworker of Etsy. It shows you a bunch of stuff; it shows you the A/B tests that are running, it shows you some pageview data, but it also shows you how long the page took to load, and it will be in red when it’s taking an abnormally long time. As people are designing and developing, we try to insert some things into the workflow to make it easier to remind them about performance as part of the process. That doesn’t always work, of course. It’s easy to ignore. But we really try to make it as natural a part of the process as possible.
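The load-time readout Lara describes could be approximated with the browser’s Navigation Timing API. Here is a rough sketch; the 2,000 ms threshold and element naming are invented for illustration, not Etsy’s actual Admin Toolbar code.

```typescript
// Sketch: show the page's load time in a toolbar badge, red when slow.
// The threshold and element id are hypothetical.
function showLoadTimeBadge(): void {
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (!nav) return;
  const loadMs = Math.round(nav.loadEventEnd - nav.startTime);
  const badge = document.createElement("span");
  badge.id = "perf-badge";
  badge.textContent = `${loadMs} ms`;
  badge.style.color = loadMs > 2000 ? "red" : "inherit"; // highlight abnormally slow loads
  document.body.prepend(badge);
}

// loadEventEnd is only populated after the load event has finished,
// so defer the read by a tick.
window.addEventListener("load", () => setTimeout(showLoadTimeBadge, 0));
```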

Tim:

That monitoring that you were talking about, where everybody gets the report: does that happen after going live, or is that on a development server somewhere? Like, those changes go out first and then that’s where the monitoring kicks in and says “Wait, you shouldn’t push this out”?

Lara:

The question also has to do with how much we do synthetic monitoring versus real user monitoring, so I’ll answer that in a couple of different ways. We have both synthetic and real user monitoring setups, which means we’re measuring what our users are actually experiencing and how slow it is for them, but we’ve also got some more stable tests running that show us how things look over time.

The alerts we get are all for production, meaning they all, in some way, shape, or form are going to be user-facing. That could mean they’re user-facing to one percent of users, that could mean they’re user-facing only to people who work at Etsy, but in some way we’re gathering those real user monitoring metrics and then alerting on them. CatchPoint is one that we use to tell us how heavy a page is; maybe an image increased in size significantly and we’ll want to know about it. But then we’ve got plenty of our own alerting using Nagios, actually, and we’ve cleaned up the alerts to make them a little easier to read and to give us some more context when something slows down.
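As a rough sketch of the real user monitoring half of that setup (again, not Etsy’s actual instrumentation), a page can beacon a few Navigation Timing numbers from every real load back to a collector. The /rum endpoint here is hypothetical.

```typescript
// Sketch: collect a few real-user timing numbers per page load and beacon
// them to a hypothetical /rum collector endpoint.
window.addEventListener("load", () => {
  setTimeout(() => {
    const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    if (!nav) return;
    const sample = {
      page: location.pathname,
      ttfbMs: Math.round(nav.responseStart - nav.startTime),
      domContentLoadedMs: Math.round(nav.domContentLoadedEventEnd - nav.startTime),
      loadMs: Math.round(nav.loadEventEnd - nav.startTime),
      htmlTransferBytes: nav.transferSize, // transfer size of the HTML document only
    };
    // sendBeacon queues the data without blocking navigation or unload.
    navigator.sendBeacon("/rum", JSON.stringify(sample));
  }, 0);
});
```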

Tim:

You’re making heavy use of WebPageTest too, at least since last I heard, right?

Lara:

Yeah, historically we’ve used a private instance of WebPageTest, which helps us do a little bit more while things are developing. But recently, just to make sure things match up to our quarterly report which we still publish every quarter, we’re relying more heavily on things like CatchPoint.

I wanted to also mention that everything I’m talking about has to do with things getting slower, and I’ve been noticing over the past few months that that’s a bummer. If all of the alerts we get are just for things going wrong, that’s kind of sad. So, one of the things that we’re going to work on next is alerting when things get better too.

Tim:

Oh nice. So again, the carrot monitoring basically.

Lara:

Exactly. I want to thank somebody and be like “Yo, why is this CSS so much smaller! Who did that? Thank you!” Again, I only know about it when people tell me about it. Because we’re so focused on making sure that the site doesn’t get slower, it’s going to be really important for us to monitor when there are wins too.

Katie:

That brings up a question that I had for you. Do you continually try to set faster goals for yourself, whether that’s quarterly or whenever you’re trying to just make it a little bit faster day by day? Do you have any interesting metrics or data from that process that you can share with us?

Lara:

We don’t do this as often as I’d like. I know in the last year or so 800 milliseconds was our SLA, our service level agreement, for back-end load time. What we ended up doing was looking at it on a per-page basis, and for the pages that were beating that, we set whatever they were currently at as their SLA—it’s per page. So, if a page was only taking 400 milliseconds to render on the back end, we’re going to say “Okay, that’s the new SLA for that one.” Unfortunately, more likely than not, because of the volume of experiments that we’re running, things are just going to get slower and not necessarily faster. Although I’d like to make sweeping faster improvements to the site, it’s just not something we’re able to do yet.

The other weird part of this is that mobile is slower, and the more mobile users we have, the slower our real user monitoring data tells us we are, just because of the shift in the userbase and the way that data is collected. That’s actually a really interesting part of this: how much should I be accounting for the fact that it’s slower, so my SLA is going to be higher on mobile, and how much should I really start pushing people towards meeting the same desktop SLAs?
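The per-page ratchet Lara describes is simple to express: start from the global 800 ms back-end SLA, and any page already beating it locks in its current time as its own SLA. A minimal sketch, with illustrative numbers:

```typescript
// Sketch of the per-page SLA ratchet: pages beating the global 800 ms SLA
// keep their current back-end time as their new, tighter SLA.
const GLOBAL_BACKEND_SLA_MS = 800;

function ratchetSlas(currentBackendMs: Record<string, number>): Record<string, number> {
  const slas: Record<string, number> = {};
  for (const [page, ms] of Object.entries(currentBackendMs)) {
    // A page rendering in 400 ms gets a 400 ms SLA; slower pages keep 800 ms.
    slas[page] = Math.min(ms, GLOBAL_BACKEND_SLA_MS);
  }
  return slas;
}

console.log(ratchetSlas({ home: 400, listing: 650, search: 950 }));
// => { home: 400, listing: 650, search: 800 }
```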

Tim:

That’s a great question. Do you have separate SLAs in place for mobile and desktop right now?

Lara:

We do. And the other thing we’re developing right now is SLAs for native apps, which is… Oh man, where my brain is right now is just on native—we’re really heads down on it right now. Right now on the front end for desktop, it’s kind of like page complete, but we’d call it front-end load time, and it’s two seconds for desktop in the United States. For mobile, I think it’s four and a half seconds. As you go farther and farther away from the United States, it’s five and a half seconds and six and a half seconds for Australia and Asia on mobile.

Tim:

I saw you soliciting feedback on what metrics and stuff to monitor on native. Has anything come out of that? What are you looking at there?

Lara:

I could spend a whole three podcasts talking about this. [Laughs] It’s fascinating to me—again, this is a privilege thing, I’m able to work at Etsy where they allow me to take time out of my day to think about native apps when very few, if any, other companies are thinking about this. We all know it’s important. Everybody has said “Yes, we know that native app performance, especially perceived performance, is important,” but no one has been able to dedicate the time to look at it because everybody is spending all their time ramping up on plain ol’ how-to-do-native. It’s a thing that we web developers are having to switch our mental models for.

So, I put out this Gist in which I proposed a set of core metrics for native app performance monitoring. Things like “How long does it take for the app to load once you tap the icon? How long are you looking at spinners for per session or per screen?” Some way to show speed index of apps—I don’t even know how we’re going to try that. And we got a fascinating response, because again, a lot of people know it’s important but nobody has necessarily thought about that core list yet. Especially not vendors—vendors were clamoring for this list because they just want to know what to build for us.

Also, I was really thankful that Colt McAnlis at Google, who’s working on the Android team, reached out and offered a ton of help in developing this list and in helping us learn what kind of tools we need in Android Studio to measure this stuff, because what we’re finding right now is, okay, we can get this stuff using Xcode or Android tooling, but it’s not easy to gather it routinely, it’s not easy to automate these things, it’s not easy to get real user monitoring for them. Even though we’re able to establish some baselines for these screens, overall we’re really in this brave new world of gathering performance metrics.

Tim:

I agree. It’s one of those areas where it feels like everybody is interested in it but nobody has actually solved it yet. One of the things you talked about is targeting things like perceived performance, not just in terms of how quickly you’re moving from page to page but from button press to app load and things like that. Going back to the web side of things, with your SLAs and the things you’re monitoring, are you watching things like rendering performance just as closely?

Lara:

The answer unfortunately is no. We only know about them anecdotally because we haven’t instrumented anything to say “Our framerate is really bad on this page.” We have had a number of experiments that showed us that that stuff is important. We reduced scrolling jank in our activity feed: when somebody logs in and they’ve got friends who favorited stuff, there were some layers of box shadows that were triggering scrolling jank, and once we removed them—oh man, the business metrics skyrocketed, our favoriting rate went way up. It was amazing. So, it’s proven to us that this stuff is definitely important. It’s just so hard to instrument. You have to run automated tests that scrape it frame by frame, and it’s not something we’ve been able to dedicate resources to yet.

Katie:

Do you have any really interesting numbers off the top of your head to share, whether that’s business related, or speed, or just any of those really awesome data statistics that we get excited about?

Lara:

My favorite one from Etsy is we ran this experiment on mobile web where we added 160 kilobytes of hidden images, meaning the user saw nothing different, we just dumped a bunch of hidden images onto the page and increased page weight by 160 kilobytes. It triggered a 12% increase in bounce rate. Insane. Twelve percent is a lot of percent.
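To make the mechanics of that experiment concrete, the treatment arm of such a test might pad the page with images the user never sees, along the lines of the sketch below. The bucketing check and image URLs are hypothetical; this is not Etsy’s experiment code.

```typescript
// Sketch: pad page weight with hidden images for the treatment arm of an
// A/B test. inExperimentTreatment() and the image URLs are hypothetical.
function inExperimentTreatment(): boolean {
  // Stand-in for a real A/B bucketing call.
  return Math.random() < 0.5;
}

function padPageWeight(imageUrls: string[]): void {
  for (const url of imageUrls) {
    const img = new Image();
    img.src = url;               // the bytes still get downloaded...
    img.style.display = "none";  // ...but nothing visible changes for the user
    img.alt = "";
    document.body.appendChild(img);
  }
}

if (inExperimentTreatment()) {
  // e.g. a handful of images totaling roughly 160 KB
  padPageWeight(["/experiment/pad-1.jpg", "/experiment/pad-2.jpg"]);
}
```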

Tim:

[Laughs] It is. That’s nuts. Just as nuts though is that you’ve gotten to a point within Etsy where it’s okay for you to say “You know what, we’re just going to add 160 kilobytes of images to this page and see how bad this hurts things.”

Lara:

I cannot say the word “privilege” enough here. Like, I need to acknowledge my privilege, because being able to work here means being able to run these loony tunes experiments or, for another example, being able to say to everybody “Hey, in a few months we’re going to slow down everybody’s internet at the office so you can feel how things really feel.” I don’t know another company, except for maybe Facebook, that would allow someone to do something so bananas.

Tim:

How long are you going to do that? Have you started already?

Lara:

Until someone threatens to fire me. [Laughs] No, we haven’t started to do that yet. What I did a year or so ago, when I was still the mobile web engineering manager, is we had a mobile web hack week in which we encouraged designers and developers to hack on mobile web stuff, and as part of the stick, and maybe less of the carrot, we forced everyone to look at mobile web templates rather than their desktop templates as they were developing. It actually helped. It helped people remember that there was this whole other part of the website that they were ignoring, and it helped our mobile web experience a little bit.

So, it’d probably be something like that, like a performance hack week, and then we can have a week of just really slow internet speeds.

Tim:

Where everyone works on dial-up.

Lara:

Mhm. That’d be amazing. Everybody would work from home that week, I can guarantee you.

Tim:

[Laughs]

Katie:

Do you think there’s a way to try that kind of testing, throwing in hidden images and stuff, on client projects and then come to the client and say “Hey, that request that you wanted that weighs this much, we tested it and there was this crazy bounce rate and all of this stuff”? How do you think people can try to do things like that with clients?

Lara:

I think it comes back to knowing what numbers are important to people. I’ve found that a lot of times you’re not going to be able to actually make an improvement, whether it’s because you don’t have the time or they need you on another project for something. So I find that as long as I can know what kind of number they’re looking for—maybe it’s return rate, maybe it’s conversions, maybe it’s the number of people who have filled out a form—if you can find comparable studies of people who have had significant wins, that, in my experience, has usually worked.

Obviously there are going to be clients who say “But that doesn’t apply to my users,” and then you’re going to have to pull out the big guns with some real feel stuff with the videos. It’s a human problem—all of these things are—and with getting them to care about this, you’re going to have to pull on the heartstrings and touch on their pride a little bit and make sure that they know that performance is a part of the user experience.

Tim:

Very true. That’s exactly why we wanted you as the first guest because I think that’s the whole point of this, is trying to focus on the fact that performance isn’t just something that is an engineering or technical issue. It’s so much more deeply rooted than that in most organizations. You guys have cultivated that culture very strongly within Etsy. You’ve mentioned experiments a million times already, so it’s safe to assume that it’s a very experimental culture where you’re okay with breaking things occasionally.

Lara:

Totally. That’s the way you learn, you know? The other nice part about working in a place where we have the tooling to do A/B tests is it means that whenever I want to test something like prefetching or a different kind of image format, I’m able to actually run that experiment and see the impact it has on our users.

Tim:

And that’s only because you’ve invested in building the tools and setting up the process that you’ve already done, right?

Lara:

Yeah, absolutely. And we’ve actually open sourced a lot of this stuff. One of the key things about performance and other parts of engineering at Etsy is that it’s very important to us to continue to give back. If I have this much privilege, I need to be using it for good. If I have these contacts at Google and elsewhere to help me build out native apps performance metrics, I need to be giving this information back to the community. So, we’ve open sourced a lot of these tools and we plan on open sourcing, or at least publishing, as much as we possibly can as we’re learning about native app performance.

Katie:

Is there a site that people can easily go to to get all of these resources?

Lara:

Absolutely. So, if you want to read up on our performance reports or any of the other engineering stuff that we do, you should go to Codeascraft.com. Then we have GitHub. If you find Etsy on there, you can see all of our open source projects.

Tim:

You know what else would be awesome is if there was a book. Could you recommend a book to perhaps read on this?

Lara:

[Laughs] Some have said that Designing for Performance, a book published by O’Reilly and also available on Amazon, has helped them change their culture. There’s a whole chapter dedicated to changing culture at your organization. The other cool thing about this book is that all proceeds go to charity to help girls learn how to code. So, if you’re invested in buying books, maybe for your entire company, you can know that there are going to be a ton of girls out there who are also learning how to code.

Katie:

Awesome.

Tim:

That’s fantastic—very awesome. Well thank you very much, Lara. We appreciate it. It was fantastic chatting with you.

Katie:

Yeah, thank you so much.

Lara:

You got it. It was wonderful. Thanks guys.