Farewell, Vanessa!

I’m proud to announce that Vanessa Martínez is leaving us for the digital team at the LA Times. Joining us in mid-2016 as a producer, Vanessa started working as a developer with the Interactives Team this year and quickly became a valuable asset in our constant battle against deadlines. Among other things, she tracked Trump’s first 100 days, gave readers a 360 tour of the new Pike Place Market, and shouldered much of the load for the infamous Profanity Peak wolves.

She’ll probably be chagrined that I’m writing this, of course. Vanessa is one of those talented people who almost seem embarrassed by their own competence, and I know that she hoped (despite all indications to the contrary) to sneak out unnoticed. But she should have known that I couldn’t let her go back to her hometown without raising some kind of fuss.

As a producer, Vanessa quickly gained a rep for being capable and thoughtful, with great news judgment and technical dexterity. She’s also known for her sensitivity and watchfulness around delicate subjects, like our coverage of race and gender. She’s quiet, but there’s steel behind that friendly smile, and even in the rambunctious hub environment, people trust her.

A good web producer is a rare and wonderful thing. But about a year ago, Vanessa decided on her own to pick up more digital skills by pitching and assembling projects with different teams in the newsroom. She started by providing data for interactive graphics, but quickly graduated to doing the development herself: a database of Seahawks injuries, the Trump tracker, and of course, our confidential news tips page. By April, she was sitting with us in the graphics department and teaming up to design big stories like the Mariners preview section and the Seattle mayor quiz. By July, we’d convinced her to eat lunch with us sometimes, and even to go home on time.

It’s always a bittersweet pleasure when someone like Vanessa leaves: we’re sorry to see her go, but happy we could play a part in her development (pardon the pun). She proved herself a valuable and beloved team member in the brief time she was here, taking on any and all challenges with a level head and nary a complaint. Congratulations and best of luck, Vanessa! We hope you’ll remember us fondly, probably while stuck in traffic, from LA.

[GIF: Jaime from Broad City gives us a slow clap]

Fifty shades of purple: mapping Seattle’s city council districts

One of the most important lessons I’ve learned as a news developer is that there’s no right way to build a data visualization. But there are plenty of wrong ways.

Reporters frequently come to us with story ideas and actual datasets, if we’re lucky. It’s usually up to us to figure out how to best visualize the data, and the design process often continues long after our development work has begun.

Such was the case for our recent Seattle city council district interactive. The project featured a choropleth map inviting readers to compare the city’s seven newly created districts across a variety of demographic measures, including age, income, and ethnicity.

The initial map was fairly easy to set up. For each demographic category, we colored the districts on a single-hue progression ranging from white (indicating that zero percent of the district’s population fell into that category) to purple (representing the maximum percentage, i.e., the district with the highest percentage of residents in that category). Most districts ended up somewhere in the middle, as in these views of the racial distribution of Seattle’s white (left) and black (right) populations:

[Maps: distribution of Seattle's white population (left) and black population (right)]
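Concretely, a ramp like this can be computed by linearly interpolating each RGB channel between the two endpoint colors. Here's a minimal sketch of the idea, not our production code; the exact purple value and the district figures are invented for illustration:

```javascript
// Hypothetical single-hue ramp: 0% maps to white, the view's maximum
// percentage maps to a full purple. RGB values are invented.
var WHITE = [255, 255, 255];
var PURPLE = [91, 44, 111];

function shadeFor(percent, maxPercent) {
  var t = percent / maxPercent; // 0 → white, 1 → full purple
  return WHITE.map(function (channel, i) {
    return Math.round(channel + (PURPLE[i] - channel) * t);
  });
}

shadeFor(83, 83); // district at the view maximum gets the full hue
shadeFor(0, 83);  // district at zero stays white
```

Every district's color is then fully determined by its value and the single per-view maximum — which, as we found out, is exactly where the trouble starts.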

These maps succeeded in providing some insight into the city’s racial makeup. It was clear, for instance, that District 2 (Rainier Valley) housed the majority of Seattle’s black population, whereas the rest of the city was consistently white.

There’s a limit to what differences the human eye can distinguish, however, and we were concerned that we were drowning out our data in a sea of subtly different shades of purple. We experimented with bumping up the contrast — first by intensifying the saturation of the maximum value (left), and then by decreasing the saturation of the minimum value (right):

[Maps: distribution of Seattle's white population with a more saturated maximum (left) and a less saturated minimum (right)]

Neither of these solutions left us happy. Both seemed to be misrepresenting the data, either by overplaying larger numbers or underplaying smaller, but not insignificant, ones.

In retrospect, the main problem with our initial approach was that it limited us to a single-hue spectrum defined by absolute maximum and minimum values. We had to apply the same color rules to multiple demographic views, and that meant that small variances between districts (e.g., the percentage of people who bike to work, ranging from 2 percent to 6 percent) appeared blown out of proportion, while larger variances appeared washed out.
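The distortion is easy to see numerically: with a 0-to-max ramp, a district's color intensity is simply its value divided by the view's maximum. Using the 2-to-6 percent bike-commute range mentioned above, and the 31-to-83 percent white-population extremes from our maps (a rough illustration, not our actual code):

```javascript
// Color intensity under a 0-to-max ramp: value / per-view maximum.
function intensity(value, viewMax) {
  return value / viewMax;
}

// Biking to work: a 4-point spread covers about 2/3 of the color range.
var bikeSpread = intensity(6, 6) - intensity(2, 6);   // ≈ 0.67

// White population: a 52-point spread covers slightly *less* range.
var raceSpread = intensity(83, 83) - intensity(31, 83); // ≈ 0.63
```

A 4-point difference between districts ends up looking just as dramatic as a 52-point one, which is precisely the misrepresentation we were worried about.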

We decided to return to our original color progression, with the addition of a legend and some new styling:

[Maps: distribution of Seattle's white population (left) and black population (right), with the new legend and styling]

We were still concerned that readers were going to have to work too hard to translate the map’s colors into numbers. It was easy enough, for example, to see that District 2 had the lowest percentage of white residents (31 percent of the district’s population), but it was much more difficult to see that District 6 (Ballard) had the highest percentage (83 percent) — 16 percentage points higher than the citywide average.

In the week before publication, we buckled down and made a series of significant changes. The result was a map based on a two-tone color progression, indicating how each district stacked up to the citywide average for each demographic. We also replaced the legend boxes with a gradient scale to make it more readable:

[Maps: distribution of Seattle's white population (left) and black population (right), with the two-tone scale]

This new approach addressed several of our concerns. Switching to a two-tone system made it much easier to identify small differences that fell just above or below average. Additionally, by centering the progression around an average value rather than scaling it from an absolute maximum, we were able to provide a more accessible, at-a-glance view of what the city looked like as a whole.
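A diverging ramp like this boils down to a signed, clamped position around the average: negative values shade toward one hue, positive toward the other. This is an illustrative reconstruction rather than our production code; the white-population numbers (83 percent high, 31 percent low, 67 percent citywide average) come from the figures above:

```javascript
// Two-tone (diverging) ramp centered on the citywide average.
// Returns a position in [-1, 1]: -1 = farthest below average,
// 0 = exactly average (neutral color), 1 = farthest above.
function divergingT(value, cityAverage, maxDeviation) {
  var t = (value - cityAverage) / maxDeviation;
  return Math.max(-1, Math.min(1, t));
}

// White population, with the largest deviation (36 points) as the scale:
divergingT(83, 67, 36); //  ≈ 0.44 → moderately above average
divergingT(31, 67, 36); // -1     → farthest below average
divergingT(67, 67, 36); //  0     → exactly average, neutral color
```

Because the neutral midpoint always means "citywide average," the same color rules read consistently across every demographic view, no matter how wide or narrow its range.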

Reporters, editors, and developers all put their heads together to work out the best presentation for this map, and the final form didn’t materialize until fairly late in the process. Our efforts were well worth it. Fielding criticisms and suggestions at each stage of the design process allowed us to identify and slowly chip away at discrete problems, and resulted in a product that everyone was satisfied with.

Wanted: News Apps/Data Summer Intern

The Seattle Times is looking for a summer intern to work with the news apps team at the intersection of computer-assisted reporting and interactive projects. We write scrapers for websites, make visualizations, and work with the newsroom to turn reporting into dynamic online experiences. Interns will partner with developers, designers and reporters on several small projects, and gain familiarity with our cutting-edge tooling and workflow process.

Just in the past few months, we’ve exposed vaccination rates for Seattle schools, invited readers to share their Super Bowl predictions and mapped the many dogs of Seattle.

Applicants should have some prior programming experience (JavaScript preferred), and should know HTML/CSS. Knowledge of Excel and/or SQL is also valuable. The internship will run for ten weeks and is a paid position. Please apply through our job board, or contact us for more information.

Only occasionally dysfunctional, always fun: Join developers Audrey Carlsen and Thomas Wilburn and digital editor Katrina Barlow at The Seattle Times.


Burying the lead: better journalism through iteration

“Kill your darlings” isn’t only good advice for print journalism. Developing a successful digital project requires ruthless editing, no matter how attached you may be to that perfect paragraph or clever piece of code. Our recent Loaded with lead series, on the dangerous contamination found at gun ranges throughout the country, is a perfect example. While the final design is striking, bold, and distinctly digital, we threw away a lot of work to reach that point. Today I’d like to show how we moved through various iterations of the project until we found something right for the piece.

When we first started putting together our online plans for “Loaded with lead,” we didn’t yet have a final version of the story, or a solid headline photo to serve as inspiration. I began by experimenting with a James Bond-like screen wipe, blooming out from a gun or target to reveal the headline. Even as rough prototypes, these concepts were underwhelming.

[Demo: click the background to play the animation]

Once we had the photo that would be used for the story in print, I tried another approach. Using WebGL and a handmade depth map, I set up a shifting perspective effect, changing the viewpoint in response to the mouse position to let readers look down the contaminated shooting range. It was a neat trick, but it didn’t really have any relationship to the reporting, or the problem of lead contamination, so we dropped it.

[Demo: move the mouse to see the depth effect in browsers that support WebGL]

At this point, art director Susan Jouflas and I started on a new concept for the design. One of the dangers of lead in gun ranges is that it’s ejected from the gun as airborne dust: from there, it’s inhaled by shooters, settles on nearby surfaces, and gets absorbed into clothing. How could we portray this pervasive contamination to readers in the browser? We spent a lot of time looking at the ways that dust is shown in film, such as the Emmy-winning title sequence for the BBC’s Great Expectations:

To produce a similar effect, I built a multi-layered particle system in WebGL. We spawned the particles from behind the headline, as though the words “loaded with lead” were themselves emitting poisonous dust. A canvas-based fallback meant that browsers without WebGL would still get a similar, if far less elaborate, display. By tweaking the balance of sizes and directions for the particles, we found ourselves with a pretty convincing simulation to place over the image. Alongside the airborne dust, we added a smear of grime that would accompany the user’s cursor (or finger, on a touchscreen), and created a treatment for the article’s pullquotes in which grime would accumulate in the corners of the quote box as the reader scrolled past.

[Demo: click to play the “grime” animation]
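The rendering code itself isn't shown here, but the heart of a layered dust simulation is a simple per-frame update step, independent of whether WebGL or canvas draws the result. A hypothetical sketch, with every constant invented for illustration:

```javascript
// One layer of a dust simulation: each particle drifts, fades, and
// respawns at the emitter. The real effect stacked several layers
// with different size and speed ranges.
function makeParticle(x, y) {
  return {
    x: x, y: y,
    vx: (Math.random() - 0.5) * 0.4, // slow horizontal drift
    vy: -Math.random() * 0.2,        // gentle upward float
    size: 1 + Math.random() * 3,
    life: 1.0                        // fades toward 0, then respawns
  };
}

function update(particles, spawnX, spawnY) {
  particles.forEach(function (p) {
    p.x += p.vx;
    p.y += p.vy;
    p.life -= 0.005;
    if (p.life <= 0) {
      // Respawn behind the headline, as in the final design.
      var fresh = makeParticle(spawnX, spawnY);
      for (var key in fresh) p[key] = fresh[key];
    }
  });
}
```

Running `update` once per animation frame and drawing each particle at an opacity proportional to its `life` is enough to get a convincing haze; the tuning is all in those magic numbers.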

In general, we liked the dust and grime, but it didn’t provoke the strong reaction we were hoping for, and some people who saw it were annoyed by the contrast between the white floating particles and the black accumulation. The only design element that our test readers really loved was the dirty fingerprints left on touchscreens, which gave the piece a gritty feel in keeping with the reporting.

With that in mind, I decided to try one more idea that had been kicking around in our discussions. Susan used chalk and watercolor to create a texture made of heavy, black dust, which would be swiped “onto” the screen in response to touch or cursor movements. As in a wax-resist painting, the user’s trail of contamination would reveal the white headline text against the previously white background. Immediately, we knew we had something special. Testers loved the effect, and it had a strong visual identity that we could use throughout the story.

Lead web producer Katrina Barlow, who did much of the design and layout on this piece, ran with the concept and integrated the black texture into the pull quotes, replacing the accumulation effect. From there, it was natural to allow the quotes to respond to a user the same way that the title did, although we made sure that they started with the top line or so already revealed. The effect was strong enough that we even dropped the cursor smudge effects, although we kept the fingerprints on large touchscreens.

Throwing away those earlier prototypes was hard to do, but we wouldn’t have been nearly as happy with the final result if we hadn’t. This is a hard lesson to learn, especially for beginning developers, who are still learning their craft and are (rightly) attached to their hard-won code, but it’s ultimately just as important as tooling and experience. When building news apps, be willing to kill your darlings: you’ll be glad you did.

Introducing our news app template

Hi, I’m Thomas Wilburn, newsroom web developer here at The Seattle Times. I work with editors and reporters to tell stories on the web, ranging from data visualizations to custom news applications. One of my most important tools for putting out great projects under deadline pressure is our news app template.

Digital storytelling is not new at The Seattle Times — you only have to look to Sea Change or our election guides to see that — but it hasn’t had a consistent process for development. Some of our news apps were built in Django, some in WordPress, and others in notepad.exe, depending on the staff assigned and their mood at the time. When I joined the newsroom earlier this year, one of my goals was to create a standard platform for digital projects, generating static files for ease of maintenance and low-stress hosting.

The result is a scaffolding system built on top of Grunt and NodeJS for producing news applications with the absolute minimum of friction (editorial or technical). In addition to populating a project with boilerplate HTML, the project initialization process also sets up the following helpful features:

  • A local development server, with live reload integration
  • Lo-dash templates, with some useful built-in helper functions
  • LESS and JavaScript compilation via AMD modules
  • A “watch” server to run build tasks automatically whenever files change
  • Bower pre-configured to install client-side libraries in a standard location
  • The ability to pull data from CSV, JSON, or Google Docs
  • One-command publication to Amazon S3

In many ways, this list is similar to scaffolding solutions from other organizations, including the NPR app template and the Chicago Tribune’s Tarbell. However, being built on NodeJS, the Seattle Times template is a bit easier to set up, and runs on more diverse software (namely, Windows). As a result, it’s been easy to get our web producers working on the same stack that we use for our big projects.

Our experience with this scaffolding has been positive so far. We can be up and running on a new project in minutes, and the common structure means that it’s easy to reuse components from one app to another. Fast deployment speeds up our development, and being able to pull from Google Docs makes it easier to bring in editors and reporters. If they can use a spreadsheet, they can edit our interactives. We’ve used it to power many of our online features this year, including “Where the Bidding Wars Are?” and our Oso Landslide timeline. It even runs our Seahawks fan map!
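To give a sense of the data-pulling step: conceptually, the template's job is to turn tabular data into plain objects that the Lo-dash templates can consume. A naive sketch of that conversion (the function name is invented, and a real implementation has to handle quoted fields, embedded commas, and so on):

```javascript
// Naive sketch: turn a CSV string into an array of row objects,
// one property per column, keyed by the header row.
function csvToObjects(csv) {
  var lines = csv.trim().split("\n");
  var headers = lines[0].split(",");
  return lines.slice(1).map(function (line) {
    var cells = line.split(",");
    var row = {};
    headers.forEach(function (h, i) { row[h] = cells[i]; });
    return row;
  });
}

var data = csvToObjects("district,median_age\n1,38.2\n2,35.9");
// data[0] → { district: "1", median_age: "38.2" }
```

Once the rows look like this, a Lo-dash template can loop over them directly, which is what makes the spreadsheet-to-page workflow so friendly for non-developers.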

As big proponents of open-source software, our team believes this kind of slick development experience is just too cool to keep to ourselves. So we’ve made our scaffolding available on GitHub under a GPL license. There are a few Seattle Times-specific bits you’ll need to adapt if you use it for yourself, such as our ad and tracking code. But other than that, I think it could be useful for anyone building static sites — inside or outside of a newspaper. If you build something with it, we’d love to hear about it!