Write the Docs: Susan Salituro – From docs to engineering and back again

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Susan is a software engineer at Pixar and has also had a long career as a technical writer. She was inspired by last year’s conference and the talks that came out of it.

Her talk is about a story, a journey of career choices and being open to opportunity. She quoted Tom Freston who said, “A career path is rarely a path at all. A more interesting life is usually a more crooked, winding path of missteps, luck, and vigorous work.”

In college Susan wasn’t sure what she wanted to do. She loved astronomy, though, and wanted to be an astrophysicist; the problem was that she didn’t truly believe she could finish a physics major. That dream was scratched. She had half the credits toward a math major, though, and pursued that. That switch left her with some free time, which she devoted to English literature.

That combination led her to take a technical writing course. It opened her eyes to the profession of technical writing. At the time she felt that was where she was destined to go. For a few years she followed that path. The community, conferences, passion, learning, and flexibility all drew her to the industry. Susan enjoyed learning what customers did with a product. It was about the customer and making their lives easier. But she started to wonder if there was something more, something deeper.

That led her to dig into what she found interesting in technical writing and where her passion lay within that. Susan got really into what made software tick. She’d spent so long understanding how programs worked from a user standpoint. Now she started learning more programming and digging in behind the scenes. She spent the next few years doing API technical writing.

Then she made her way to Pixar. The manager there wanted to use DITA for a new project. Through that she got the chance to learn Python to write doc scripts and Make to execute those scripts. While she never thought of herself as a programmer, she had accumulated enough skill to enter that world and make programming her own.

The next challenge came in moving to the software release engineering team at Pixar. This pushed her onto an even steeper learning curve, learning Perl and deeper levels of Python. The community she’d found in technical writing was still there, but it was mostly internal to Pixar. The mentoring and help came from inside the company, but Susan didn’t get the sense of a large external community around these skills. The flexibility she found in tech writing also diminished. On-call hours were expected and work came at all times.

After moving to Canada, Susan changed companies and shifted to an information architecture role at a smaller company. She was now working solo without a strong internal community but was able to stretch her reach into new roles that had thriving external communities. Unfortunately the company went bankrupt and Susan went back to Pixar as a Software Infrastructure Engineer.

On one level this new role meant reaching her goal of melding documentation and programming. The mandate was to find solutions for doc problems and address tools and process issues. The focus was not just documentation but the entire software life cycle. She worked on tools for the release engineering team, the quality assurance group, and more. At heart she still considered herself a technical writer.

Write the Docs: Drew Jaynes – Putting the (docs) Cart Before the (standards) Horse

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Drew opened the second day of talks Tuesday. He works as a web engineer at 10up and contributes to the WordPress project as well. His talk focused on WordPress’ approach to inline documentation.

Up until about 8 months ago WordPress paid very little attention to the details of inline docs. There were more than 10 years of loosely following phpDoc 1.0. The codex, a wiki-based documentation site, was still seen as the main entry point to learning more.

This wasn’t working anymore with hundreds of thousands of lines of code. The codex had grown to 2,300+ pages of manually curated content. That meant each release saw the docs team manually going through the codex and trying to create pre-release master lists of required updates in the docs.

Enter hook docs. The docs team decided to create their own doc spec, largely inspired by existing frameworks. They were expecting a large influx of contributors to docs and wanted to set a consistent baseline for new people to the project.

The evolution of the docs team saw them establish themselves as a more official contributor group. They did an informal summit at the Open Help Conference last summer, ran a codex survey to see how people used it, and developed a roadmap for documentation going forward. Never underestimate the power of a docs team armed with a roadmap.

As soon as that roadmap was in place the docs team was off to the races. The top item was to burn the codex with the fire of a thousand suns; but first they needed something to replace it. Work began on a parser and code reference theme and hook docs began in earnest.

At this point the docs team has 3-5 sub-teams contributing to various aspects of the docs roadmap. There are weekly chats and virtual sprints over the weekends. Furthermore, the collaboration with other teams has raised the profile of docs within the project. 8 months into this project they’ve documented every hook in the project, more than 2,200 of them. There was a 48% increase in inline documentation over just 3 releases spanning one year. Those same 3 releases saw 40 new contributors to docs.

The biggest difference is that there are now people paying attention to the details of documentation. It ensures things are consistent, complete, and reviewed. Developing that standard for documentation was immensely helpful. Part of that was taking the docs created over 10 years and deriving the standard from them. In the long term this allows them to adopt new aspects into the spec without causing vast disruption each time.

Write the Docs: Christina Elmore – Death by Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Christina wrapped up the first day of talks. She opened by saying how she regretted the title for her talk, because her thinking has evolved.

She started her talk speaking about poor presentations. Most people blame PowerPoint for creating bad presentations. What Christina believed, though, was that presentations were trying to document themselves. This is what led to the wall of text on a slide. People were conflating the goals of documentation and presentation.

What’s the difference between the two? First, documentation is self-guided and meant to be read. Presentations are presenter-guided and meant to be heard and watched. With these differences in place Christina proposed a principle: good presentation slides are bad documents. The differences above are key. Our brains process the information in each differently.

People raise questions about that principle, though.

  1. How will I remember?
  2. What will I share?
  3. What if I’m not there to explain it?

The first question gets at who really matters. You, as the presenter, are not who matters. Your knowledge and memory from the presentation are not central; what matters is the audience’s. To remember, create a story structure; this benefits both your audience and your working memory. Rehearse your presentation to build retention. Both of these work for your audience and for you.

Matt Haughey’s guide to better presentations covers the second question. Ultimately the content of your slides should not be the key thing you are trying to share. As Matt wrote,

The one thing you might notice in my advice and previous slides is that they don’t really make much sense outside of my talk. I’m ok with that, since I hope people get to attend the talk or see video of the talk.

The third question asks if you should be creating a document or a presentation. If you require those walls of text then perhaps the information is not best explained in a presentation. Sometimes the best deliverable is not a slide but a report or a text-based document.

What’s at stake here is that our documents suffer as well when we don’t understand the difference between documents and presentations. To start you can, well, document the difference between these formats of information. One way to look at this is to distinguish between a presentation and a conversation. They aren’t the same thing.

Christina ended with a bold proposal: work to eliminate the presentation within your organization. Amazon and LinkedIn have done this; their meetings begin with 10 minutes of silence where everyone reads over the same document. Then they discuss things in a true conversation. By doing this you can ensure presentations happen only when necessary. You can then nurture a healthy culture of presentation.

Write the Docs: Amelia Abreu – Data, Documentation and Memory

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Amelia talked about some of the theories behind documentation. She referred to it as a decidedly non-practical talk. Amelia started as an archivist.

The first question she looked at is: what is it that we document? Part of what we do as documentarians is actually looking for what is not documented. At times that’s just as important as what is documented. In her work as an archivist and ethnographer, part of what Amelia did was look at that which happened off the record. You explore those areas that are hidden in the official record and poke at whether they can be defined.

Amelia then talked about movements during the early and mid-1900s in France. These were the communities that created microfilm and other archival standards even during the disruption and oppression of World War II. While we’re in a far different circumstance than 1950s France we can still learn a lot from the work done during these periods. Some of the French researchers believed that shipping microfilm copies of great works of text around the world would help lead toward global peace.

Metadata is at the core of our institutional knowledge. Amelia used the example of Amazon which sold books at close to cost as a means to gather knowledge about the shopping habits of affluent customers.

As documentarians we are the providers of information architecture and infrastructure. Thinking about infrastructure need not be limited to human versus technological components. It can also be thought of in terms of interrelated social, organizational, and technical components or systems. We also must recognize human work as infrastructure. Infrastructures are often used in such a way that exhausts the best intentions of those who created them. Documentation is no exception to this process.

Write the Docs: Maxwell Hoffmann – Say more with less: writing for a global audience

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Maxwell works at Adobe as a product evangelist. He’s spent 12 years in translation and localization with more than 20 years working in traditional technical documentation.

We’ve come full circle in terms of our hand-held devices. We’ve gone from cuneiform tablets to scrolls to desktop computers and now back to hand-held smartphones. This means our content must be in shorter pieces, faster to load, and easier to scan.

The biggest barrier to good translation in Maxwell’s mind is poor English. The more complex the language, the harder the translation. Simplifying your docs and your language simplifies your translation. This helps on hand-held devices too, since simplified English can cut content by as much as 30%. We can all benefit from writing simplified English.

Maxwell covered 10 key rules for writing ASD-STE 100 simplified English:

  1. Write positive statements.
  2. Write short sentences.
  3. Use one topic per sentence.
  4. Use valid nomenclature.
  5. Write simple sentences.
  6. Use the active voice.
  7. Do not use gerunds (-ing words).
  8. Avoid conditional tenses.
  9. Avoid word clusters.
  10. Use valid nomenclature & abbreviations.

To work on these rules try writing an outline on a typewriter. With this you can only write; there are no distractions. There’s no editing or formatting. You will feel and write differently. Or, if typewriters aren’t your thing, you can break long-sentence writing patterns by dictating your writing.
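As a rough illustration (my sketch, not something Maxwell showed), a couple of the rules above are simple enough to check mechanically, for example flagging over-long sentences and -ing words:

```python
import re

# Toy checker for two of the simplified-English rules above: short sentences
# and no gerunds. The 20-word limit is an assumption for illustration, not a
# number from the talk.
MAX_WORDS = 20

def check(text):
    warnings = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        words = sentence.split()
        if len(words) > MAX_WORDS:
            warnings.append(f"Long sentence ({len(words)} words): {sentence!r}")
        for word in words:
            if word.strip(".,;:").lower().endswith("ing"):
                warnings.append(f"Possible gerund {word!r} in: {sentence!r}")
    return warnings

for warning in check("Removing the cover, check the wiring before restarting the unit."):
    print(warning)
```

A real Simplified English checker does far more than this, but even a crude pass like this can catch the most common slips before translation.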

Write the Docs: Simeon Franklin & Marko Gargenta – TechDocs at Twitter: Creating the Culture of Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Marko and Simeon both work at Twitter on documentation projects. About 9 months ago Twitter acquired a software training company Marko and Simeon both worked at. The goal was to create Twitter University for internal training and learning. It didn’t take long, though, for someone to suggest they tackle the technical documentation as well.

The technical documentation they work on at Twitter is internal-facing. It’s engineer-to-engineer documentation and not the public API specs. They ran a survey about the internal docs. Using the net promoter score standard they found that most people hated the internal docs. Some common complaints were that the docs were incomplete, deprecated, non-existent, or unclear.
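For context, a net promoter score comes from zero-to-ten ratings: 9s and 10s count as promoters, 0 through 6 as detractors, and the score is the gap between the two as a percentage of respondents. A minimal sketch (the sample numbers are invented, not Twitter’s survey data):

```python
# Standard NPS calculation from 0-10 survey responses; the scoring bands are
# the usual NPS definition, nothing specific to Twitter's survey.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# A docs survey skewed toward low ratings yields a deeply negative score.
print(net_promoter_score([2, 3, 6, 7, 8, 9, 4, 5]))  # -50.0
```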

Their technical docs team is just 3 doc writers. They view the team as the engineering grease internal to the company. The goal is to make things run smoother. To do this they’re making docs like code. The process they’re working toward is to code, test, doc, and then ship.

They’re taking lessons from the test-driven development shift to bring docs into that process. Developers at Twitter have gotten accustomed to needing tests finished before they commit code. The challenge is to make that cultural assumption the same around docs.

The plan is three-pronged. At the base is DocBird, the publishing platform. Above that lie self-service tools and templates. Finally there’s DocDay, an internal doc sprint that aims to promote documentation internally. As they got started they realized there was no coherent plan or standard for docs. Different teams used different tools, formats, and processes. Some teams were even using Google Docs. Their internal wiki has more than 60,000 pages across more than 150 spaces.

In treating their docs as code they’re checking the docs into git; those docs are plain text, and they’re regularly reviewed and merged. This is where technology comes to the rescue. They built a homebrew documentation platform they call DocBird. They hit a few roadblocks in this process: that DocBird is not a wiki, isn’t reStructuredText, and isn’t generated from the source code were all initial problems.

Since the docs live in the code repo they’re able to track how far the docs drift away from the code. The goal is to keep things in sync, but it’s a challenge.
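The talk didn’t go into how that drift is measured, but with docs and code in the same repository one crude approach is to compare when each was last touched. A sketch, assuming a layout where a project keeps code in src/ and docs in docs/ (both directory names are my assumption, not Twitter’s):

```python
import subprocess

def last_commit_time(path):
    """Unix timestamp of the most recent commit touching `path`."""
    out = subprocess.check_output(
        ["git", "log", "-1", "--format=%ct", "--", path], text=True
    ).strip()
    return int(out) if out else 0

def docs_lag_days(code_dir="src", docs_dir="docs"):
    """Days the docs trail the most recent code change (0 if docs are newer)."""
    lag = last_commit_time(code_dir) - last_commit_time(docs_dir)
    return max(lag, 0) / 86400  # seconds per day

print(f"docs trail code by {docs_lag_days():.1f} days")
```

Run inside the repo, this gives a rough, per-project signal of which docs are going stale.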

The template layer of their project is aimed at enabling developers to get to minimum viable documentation as fast as possible. They’re working to enforce consistency in documentation through those same templates; things like shared language are important here. It solves the problem discussed in an earlier talk about how intimidating a blank page is. By filling some default language into the template they help developers get over that initial hurdle.

In terms of coverage the goal is to have 100% of known projects covered by docs. They’re bootstrapping this by starting with readme files that point to all the docs that exist. They don’t all have to be in DocBird right away.

With only 3 people, using resources well is key. They’re pushing education as a way to leverage those three documentarians for maximum impact. By working on templates, for example, 3 documentarians can influence hundreds of docs. Ultimately better technology alone will not create a culture of documentation.

The key problem is getting Twitter’s thousand+ engineers to write docs. On the one hand they’re telling the engineers that a lack of docs should be considered technical debt. On the other, though, they’re telling the engineers that if they want other teams to adopt their library they need to package it attractively; docs are a key part of that package.

Write the Docs: Kenneth Reitz – Documentation at Scale

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Kenneth wrapped up the set of talks before the afternoon break. He works for Heroku, a web application deployment company. He’s also a member of the Python Software Foundation and an active member of the community. He seriously contributes to about 18 projects and has around 100 little experiments.

His talk focused on language, and how it works within documentation. Spoken language is what allows us to express ideas to one another. Our written language abilities then allow us to express those ideas over a historical time scale. As we’ve developed these skills the human hardware hasn’t changed, but the software’s been upgraded.

As we’ve evolved our ability to communicate with one another has expanded from one-to-one conversations to one-to-many. The printing press, for one, gave the ability for a single entity to communicate to an extremely large mass of people. This continued with newspapers, books, television, and radio. This formed the narrative of “the public.”

The next step in communication is many-to-many. This format is inherent to the internet. If you have access to the internet you have access to a universe of information and ideas. The implications of this shift are huge.

Documentation doesn’t have to be about a software workflow or open source project – it can be used to convey ideas much larger than yourself. Its information architecture is inherent and a powerful tool for developing ideas over time. Finally, it enables us to develop ideas on a longer timeline and with more people than we’d ever be able to do by ourselves.

Kenneth works on The Hitchhiker’s Guide to Python, an open source project that’s outlined and curated through a combination of GitHub and Read the Docs.

Write the Docs: Lois Patterson – What Makes Good API Documentation: An Example-Based Approach

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Lois got interested in this subject through working on an API product she had to document. The company she works for has a very complex project that they were building an API for. They saw the API as a means of letting developers narrow the scope of functionality available to certain user bases.

To start on this project Lois researched other good API doc sets on the web. What she found was that if you have a good API the docs are quite likely to be good as well. Twitter, GitHub, Facebook, and Flickr were cited as examples here. Why are these good APIs? First, they segment documentation into buckets. Twitter has API documentation broken into tweet, user, and other specific aspects. Similarly Flickr’s App Garden gives you concrete examples of how other developers have used the API.

While there aren’t inherent API standards those APIs which become extremely popular become de facto standards. The ProgrammableWeb service is a good resource for searching through various public web APIs.

Swagger and Atlassian’s REST browser are handy tools for exploring and testing the content within an API.

Lois argued that any good API documentation should have good descriptions, tutorials, code examples (both snippets and working apps), a sandbox test environment, as well as FAQs. At Lois’ company someone tried to argue that code examples weren’t needed because, “any good developer can figure out how to make a working application.” While they may technically be able to figure it out, the job of your API docs is to make that process easier. Clear versioning for the API as well as suggestions on using the API are also helpful.
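As a concrete illustration of the kind of snippet Lois is describing, a code example in a REST API’s docs might look something like this; the endpoint, parameters, token, and response shape are all invented for the sake of the sketch:

```python
import requests

# Hypothetical endpoint and token, purely for illustration.
API_ROOT = "https://api.example.com/v2"
TOKEN = "your-api-token"

# Fetch the ten most recent widgets, newest first.
response = requests.get(
    f"{API_ROOT}/widgets",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 10, "sort": "-created_at"},
)
response.raise_for_status()

for widget in response.json()["items"]:
    print(widget["id"], widget["name"])
```

A short, copy-pasteable snippet like this, paired with the matching reference entry, is usually what gets a developer to their first successful call.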

A first pass on API docs can be as simple as taking the spec and turning it into a user-facing doc. Taking that to the developers then gives you feedback to work from and a set of points to clarify.

Write the Docs: Matthew Lyon – Minimum Viable Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Matthew talked about how the concept of a minimum viable product would apply to documentation. The minimum viable product, or MVP, stems from the belief that building software is expensive and risky. So sometimes the best solution is to build the bare-bones version first and then rapidly iterate.

You want to test the feasibility of an idea before sinking more resources into it. Even if you’re just spending your time, there’s the opportunity cost of building software. By building as little as possible you can launch faster. Unfortunately this frequently means documentation is cut. This, in some cases, is because product managers don’t understand the value that documentation provides.

You can do early-stage testing with early adopters. However, if your audience is mass-market they may not be the best test audience. Matthew’s example here was, “How many software developers do you know that use Pinterest? What if they used tech-based early adopters as their test audience?”

The product you create must work better than existing solutions. This applies to the product, documentation, and the tools you use to create both. Sometimes printing out a form, filling it out, and faxing it back is easier than dealing with an Adobe browser plugin.

Unfortunately lots of people think of documentation as a safety net for your user experience. This isn’t necessarily a healthy relationship.

Interfaces are, at their base, about manipulating abstractions. Without a strong grasp on that iconography you’re going to have a difficult time using that product. This is a problem, though, that technical writers are well-placed to deal with. They’re frequently in a good place to help understand how users will interpret what you are saying.

One way to help with documenting your product is to pair with new developers as they come on board. You can document the difficult things as they have questions. This lets you save time in the long run as common stumbling blocks are removed and documented. Reference docs, in this sense, are a form of outboard memory. They’re primarily for people who understand how something works and are looking to refresh their memory. While reference docs are key for APIs and other elements they aren’t the best teaching tools.

To write the best teaching doc you need to, on a certain level, forget what it’s like to be an expert user of your tool. Personas can help with this, as they help you come at a problem from a given perspective. The how-to document has, Matthew argues, the biggest impact for getting a user from zero to hero. It’s a tried-and-true documentation format, and it works. How-to docs work with user experience design to effectively document the experience of using your product. If the process for doing something is too complex to write up, it’s likely too complex to use as well.

You can also use your how-to docs to regression test your user experience. When developing software you’re going to break something. Having how-to docs in place helps communicate how the software is supposed to work. You can catch bugs when there’s a deviation from that canonical how-to explanation.
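One way to read that idea (a sketch of my own, not something Matthew showed) is to encode each step of a how-to doc as an automated check, so a failing step flags either a bug or a doc that has drifted from reality:

```python
# Sketch of a how-to doc doubling as a regression test. The `client` object
# and its methods are hypothetical stand-ins for whatever drives your product
# (an API client, a browser automation session, and so on).
def test_signup_howto(client):
    # Doc step 1: "Create an account with your email address."
    account = client.create_account(email="user@example.com")
    assert account.id is not None

    # Doc step 2: "Verify the address using the emailed code."
    assert client.verify(account.id, code=account.last_email_code)

    # Doc step 3: "Log in and confirm you land on the dashboard."
    page = client.login(email="user@example.com")
    assert page.title == "Dashboard"
```

When a step like this starts failing, either the software broke or the canonical how-to needs updating, and both are worth catching.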

There’s also a way for writers to be proactive in a lean development process. As Matthew put it, what if you thoroughly prototyped your user experience in words first? It’s an author’s equivalent of test-driven development. Combining that approach with some UX design knowledge gives you the opportunity to truly be a leader.

Write the Docs: Geoffrey Grosenbach – Graphical Explanations

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Geoffrey is the VP of Open Source Content at Pluralsight, a video tutorial company. He opened his talk speaking about Information Anxiety and how the book was written for someone to skim, flip, or read through. Each is a valid means for consuming information. Could we apply this approach to documentation?

Graphical elements allow your documentation to be consumed at different rates and in different flows. This can make your docs more respectful of your reader’s time. As part of this Geoffrey recommended reading Information Dashboard Design, which looks like a fascinating book.

One way that he’s used graphical explanations is through type. Yes, it’s text but there is plenty we can learn just from type. When the web started it was designed as a textual delivery system; graphical requirements weren’t necessarily considered. Many of our documentation systems still take this approach. A simple pull quote can be a graphical hook that lets someone skimming a document understand the core ideas. Combinations of font size and color also convey graphical meaning. Think of a stop sign or a warning sign as two main examples.

Icons are the next graphical explanation Geoffrey covered. One of his favorites is Symbolicons, an icon font that looks like a neat tool for easily adding vector icons to your text.

Beyond that the power of graphics is not just that we’re making pictures on the screen. The power lies in the explanation behind that graphic and its ability to communicate the meaning within a document.

There are models for conceptualizing these explanations. Geoffrey walked us through examples of good/bad, good/better/best, before/after, and other core comparisons we understand as explanations. Many involve showing a reader the different types side by side. It helps readers understand how something can be transformed or what an ideal result is.