
Write the Docs: Christopher Kelleher – Make Music Not Noise

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Christopher wrapped up the first group of talks before the morning break. His first boss taught him that making sound is not the same as making sense. That concept evolved in his mind into the idea that information is not the same as communication. The standards of music can guide us here.

Since that conversation with his first boss Christopher has worked on dozens of types of docs. The metaphor of information versus communication has helped him throughout.

The music world defines noise as sound without structure. When you take sound and place it within a satisfying structure you get music. The search for this satisfying structure is innate. When people are placed in an MRI and played unfamiliar music, the intellectual centers of their brains light up. They immediately try to make sense of the sound and determine its patterns.

With documentation we tend to accept docs that fail in the same way unguided, unstructured noise does, but we lack the same innate cognitive ability to flag them. When a document doesn’t communicate it fails, just as sound fails when it doesn’t exist within a satisfying structure. The likelihood that a document without communication conveys knowledge is extremely low. Ask yourself whether the doc you’re creating communicates, rather than merely contains, the necessary information.

Next Christopher talked about deliberate noise. Much of the time noise is accidental, but sometimes it serves a purpose. The most common lie on the internet is, “I have read and agreed to the terms of service.” No meaningful attempt at communication resembles a terms of service. iTunes, for example, has a 29-page terms of service.

Its goal is not communication. Its goal is to protect the legal interests of Apple and to force you to click “I agree” without reading. This is often what noise in documents does; it gets you to acquiesce.

Deliberate noise comes into play frequently in the legal and government arenas. Documentation noise is treated as unavoidable. But we totally can avoid it! A musical document succeeds by feeding your brain what it craves; it satisfies that innate need for structure. This music-like satisfaction is more than a nice-to-have; it’s essential.

Write the Docs: James Pearson – Don’t Write Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

James isn’t a writer by trade; he’s a system administrator. For him documentation is a way of ensuring that everything works and that users are happy. He works at iFixit, where they teach everyone to repair the products they use every day. The documentation on iFixit’s site is all wiki-based and very graphics-heavy. About half the guides are user-contributed.

As James said, good documentation’s worst enemy is bad documentation. Not everyone loves writing documentation and when the love isn’t there the quality can suffer. There’s something different in our brains about docs. We don’t treat them like software and, as a culture, are sometimes numb to documentation. We assume it’s bad and avoid reading it. If we can focus on increasing the quality of docs we can combat this numbing effect.

When you allow bad documentation to linger it drags the quality of the overall library down, a similar effect to the broken windows theory. Bad documentation allows people to internalize the notion that it’s okay to have crappy documentation. That’s not what we want; we want high standards for quality. As an aside, outdated docs fall into this bucket as well, and updating them is frequently an easy fix.

The first category of docs that James really, really hates is auto-generated docs. There are three great virtues of a programmer: laziness, impatience, and hubris. Laziness is what drives us to create auto-generated docs. Frequently the result is not worthwhile, not in-depth, and not reviewed with an eye toward user-facing clarity. Javadoc may be a good tool, but it’s gone bad. Just because you can do something with a tool doesn’t mean you should. Computers, at their core, suck at writing docs for humans to read.
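Javadoc is Java, but the failure mode is easy to sketch with a Python docstring instead (everything below is an invented illustration, not an example from the talk): an auto-generated-style comment restates the signature and communicates nothing, while a short human-written one actually tells the reader something.

    # Auto-generated style: the docstring merely restates the signature.
    def connect(host, port, timeout=30):
        """Connect to host on port with timeout."""
        ...

    # Human-written: the docstring answers the questions a reader actually has.
    def connect(host, port, timeout=30):
        """Open a control connection to a device.

        Describe what happens on failure, what exceptions to expect, and
        why you might choose a short timeout for interactive tools,
        rather than restating the parameter names.
        """
        ...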

James then covered the second category of bad docs: those that use overly verbose language. By knowing your audience you can find places to omit needless words. One audience may require great detail while another can get by with just an overview. ssh allows for 4 levels of verbosity within the tool itself (the default plus -v, -vv, and -vvv). What James seeks is a documentation equivalent of that.

The final thing James talked about was mistake-proofing. Humans will always make mistakes, but mistakes don’t always need to become defects in the final product. An engineer at Toyota came up with the concept of deliberately designing mistake-proofing into your product creation process. The trick is to fix these product failures rather than documenting around them.

After the talk James posted his slides. Check them out.

Write the Docs: Susan Salituro – From docs to engineering and back again

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Susan is a software engineer at Pixar and has also had a long career as a technical writer. She was inspired by last year’s conference and the talks that came out of it.

Her talk is about a story, a journey of career choices and being open to opportunity. She quoted Tom Freston who said, “A career path is rarely a path at all. A more interesting life is usually a more crooked, winding path of missteps, luck, and vigorous work.”

In college Susan wasn’t sure what she wanted to do. She loved astronomy, though, and wanted to be an astrophysicist; the problem was that she didn’t truly believe she could finish a physics major. That dream was scratched. She had half the credits toward a math major, though, and pursued that. That switch left her with some free time, which she devoted to English literature.

That combination led her to take a technical writing course. It opened her eyes to the profession of technical writing. At the time she felt that was where she was destined to go, and for a few years she followed that path. The community, conferences, passion, learning, and flexibility all drew her to the industry. Susan enjoyed learning what customers did with a product. It was about the customer and making their lives easier. But she started to wonder if there was something more, something deeper.

That led her to dig into what she found interesting in technical writing and where her passion lay within that. Susan got really into what made software tick. She’d spent so long understanding how programs worked from a user standpoint; now she started learning more programming and digging in behind the scenes. She spent the next few years doing API technical writing.

Then she made her way to Pixar. The manager there wanted to use DITA for a new project. Through that she got the chance to learn Python for writing doc scripts and make for running those scripts. While she never thought of herself as a programmer, she had accumulated enough skill to enter that world and make programming her own.

The next challenge came in moving to the software release engineering team at Pixar. This pushed her up an even steeper learning curve, learning Perl and deeper levels of Python. The community she’d found in technical writing was still there, but it was mostly internal to Pixar. The mentoring and help came from inside the company, but Susan didn’t get the sense of a large external community around these skills. The flexibility she found in tech writing also diminished; on-call hours were expected and work came at all times.

After moving to Canada, Susan changed companies and shifted to an information architecture role at a smaller company. She was now working solo without a strong internal community, but was able to stretch her reach into new roles that had thriving external communities. Unfortunately the company went bankrupt and Susan went back to Pixar as a Software Infrastructure Engineer.

On one level this new role meant reaching her goal of melding documentation and programming. The mandate was to find solutions for doc problems and address tools and process issues. The focus was not just documentation but the entire software life cycle. She worked on tools for the release engineering team, the quality assurance group, and more. At heart she still considered herself a technical writer.

Write the Docs: Drew Jaynes – Putting the (docs) Cart Before the (standards) Horse

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Drew opened the second day of talks Tuesday. He works as a web engineer at 10up and contributes to the WordPress project as well. His talk focused on WordPress’ approach to inline documentation.

Up until about 8 months ago WordPress paid very little attention to the details of inline docs. For more than 10 years the project loosely followed phpDoc 1.0. The codex, a wiki-based documentation site, was still seen as the main entry point to learning more.

This wasn’t working anymore with hundreds of thousands of lines of code. The codex had grown to 2,300+ pages of manually curated content. That meant each release saw the docs team manually going through the codex and trying to create pre-release master lists of required updates to the docs.

Enter hook docs. The docs team decided to create their own doc spec, largely inspired by existing frameworks. They were expecting a large influx of contributors to docs and wanted to set a consistent baseline for new people to the project.

The evolution of the docs team saw them establish themselves as a more official contributor group. They did an informal summit at the Open Help Conference last summer, ran a codex survey to see how people used it, and developed a roadmap for documentation going forward. Never underestimate the power of a docs team armed with a roadmap.

As soon as that roadmap was in place the docs team was off to the races. The top item was to burn the codex with the fire of a thousand suns, but first they needed something to replace it. Work began on a parser and a code reference theme, and hook docs began in earnest.

At this point the docs team has 3–5 sub-teams contributing to various aspects of the docs roadmap. There are weekly chats and virtual sprints over the weekends. Furthermore, the collaboration with other teams has raised the profile of docs within the project. 8 months into this project they’ve documented every hook in the project, more than 2,200 of them. There was a 48% increase in inline documentation over just 3 releases spanning one year. Those same 3 releases saw 40 new contributors to docs.

The biggest difference is that there are now people paying attention to the details of documentation, ensuring things are consistent, complete, and reviewed. Developing a standard for documentation was immensely helpful; part of that was taking the docs created over the previous 10 years and deriving the standard from them. In the long term this allows them to fold new aspects into the spec without causing vast disruption each time.

Write the Docs: Christina Elmore – Death by Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Christina wrapped up the first day of talks. She opened by saying how she regretted the title for her talk, because her thinking has evolved.

She started her talk speaking about poor presentations. Most people blame PowerPoint for creating bad presentations. What Christina believed, though, was that presentations were trying to document themselves. This is what led to the wall of text on a slide. People were conflating the goals of documentation and presentation.

What’s the difference between the two? First, documentation is self-guided and meant to be read. Presentations are presenter-guided and meant to be heard and watched. With these differences in place Christina proposed a principle: good presentation slides are bad documents. Our brains process the information in each differently.

People raise questions about that principle, though.

  1. How will I remember?
  2. What will I share?
  3. What if I’m not there to explain it?

The first question gets at who really matters. You, as the presenter, are not who matters; your knowledge and memory of the presentation are not central. The audience’s are. To remember, create a story structure; this benefits both your audience and your working memory. Rehearse your presentation to build retention. Both techniques work for your audience and for you.

Matt Haughey’s guide to better presentations covers the second question, about sharing. Ultimately the content of your slides should not be the key thing you are trying to share. As Matt wrote,

The one thing you might notice in my advice and previous slides is that they don’t really make much sense outside of my talk. I’m ok with that, since I hope people get to attend the talk or see video of the talk.

The third question asks if you should be creating a document or a presentation. If you require those walls of text then perhaps the information is not best explained in a presentation. Sometimes the best deliverable is not a slide but a report or a text-based document.

What’s at stake here is that our documents suffer as well when we don’t understand the difference between documents and presentations. To start you can, well, document the difference between these formats of information. One way to look at this is to distinguish between a presentation and a conversation. They aren’t the same thing.

Christina ended with a bold proposal: work to eliminate the presentation within your organization. Amazon and LinkedIn have done this; meetings there begin with 10 minutes of silence in which everyone reads the same document. Then they discuss things in a true conversation. By doing this you can ensure presentations happen only when necessary, and you can then nurture a healthy culture of presentation.

Write the Docs: Amelia Abreu – Data, Documentation and Memory

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Amelia talked about some of the theories behind documentation. She referred to it as a decidedly non-practical talk. Amelia started as an archivist.

The first question she looked at was: what is it that we document? Part of what we do as documentarians is actually looking for what is not documented; at times that’s just as important as what is documented. In her work as an archivist and ethnographer, part of what Amelia did was look at what happened off the record. You explore those areas that are hidden in the official record and poke at whether they can be defined.

Amelia then talked about documentation movements in France during the early and mid-1900s. These were the communities that created microfilm and other archival standards, even during the disruption and oppression of World War II. While we’re in a far different circumstance than 1950s France, we can still learn a lot from the work done during these periods. Some of the French researchers believed that shipping microfilm copies of great works of text around the world would help lead toward global peace.

Metadata is at the core of our institutional knowledge. Amelia used the example of Amazon which sold books at close to cost as a means to gather knowledge about the shopping habits of affluent customers.

As documentarians we are the providers of information architecture and infrastructure. Thinking about infrastructure need not be limited to human versus technological components. It can also be thought of in terms of interrelated social, organizational, and technical components or systems. We also must recognize human work as infrastructure. Infrastructures are often used in such a way that exhausts the best intentions of those who created them. Documentation is no exception to this process.

Write the Docs: Maxwell Hoffmann – Say more with less: writing for a global audience

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Maxwell works at Adobe as a product evangelist. He’s spent 12 years in translation and localization with more than 20 years working in traditional technical documentation.

We’ve come full circle in terms of our hand-held devices. We’ve gone from cuneiform tablets to scrolls to desktop computers and now back to hand-held smartphones. This means our content must be in shorter pieces, faster to load, and easier to scan.

The biggest barrier to good translation, in Maxwell’s mind, is poor English. The more complex the language, the harder the translation. Simplifying your docs and your language simplifies your translation. It also helps on hand-held devices, since simplified English can shrink content by as much as 30%. We can all benefit from writing simplified English.

Maxwell covered the 10 key rules for writing ASD-STE100 simplified English:

  1. Write positive statements.
  2. Write short sentences.
  3. Use one topic per sentence.
  4. Use valid nomenclature.
  5. Write simple sentences.
  6. Use the active voice.
  7. Do not use gerunds (-ing words).
  8. Avoid conditional tenses.
  9. Avoid word clusters.
  10. Use valid nomenclature & abbreviations.

To work on these rules try writing an outline on a typewriter. With it you can only write; there are no distractions, no editing, no formatting. You will feel and write differently. Or, if typewriters aren’t your thing, you can break long-sentence writing patterns by dictating your writing.

Write the Docs: Simeon Franklin & Marko Gargenta – TechDocs at Twitter: Creating the Culture of Documentation

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Marko and Simeon both work at Twitter on documentation projects. About 9 months ago Twitter acquired a software training company that Marko and Simeon both worked at. The goal was to create Twitter University for internal training and learning. It didn’t take long, though, for someone to suggest they tackle the technical documentation as well.

The technical documentation they work on at Twitter is internal-facing: engineer-to-engineer documentation, not the public API specs. They ran a survey about the internal docs and, using the net promoter score standard, found that most people hated them. Common complaints were that the docs were incomplete, deprecated, non-existent, or unclear.

Their technical docs team is just 3 doc writers. They view the team as engineering grease inside the company; the goal is to make things run more smoothly. To do this they’re making docs like code. The process they’re working toward is to code, test, doc, and then ship.

They’re taking lessons from the test-driven-development shift to bring docs in to that process. Developers at Twitter have gotten accustomed to needing tests finished before they commit code. The challenge is to make that cultural assumption the same around docs.
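As a rough sketch of what a test-like gate for docs could look like (my own illustration, not Twitter’s actual DocBird tooling; the src/ and docs/ layout and the one-doc-per-package rule are assumptions), a check like this could run in CI or as a pre-commit hook and fail the build when a package ships without a doc page:

    #!/usr/bin/env python3
    """Fail the build if any package under src/ lacks a page under docs/."""
    import sys
    from pathlib import Path

    SRC = Path("src")
    DOCS = Path("docs")

    def missing_docs():
        """Yield package names that have code but no matching docs/<name>.md."""
        for package in sorted(p for p in SRC.iterdir() if p.is_dir()):
            if not (DOCS / f"{package.name}.md").exists():
                yield package.name

    if __name__ == "__main__":
        missing = list(missing_docs())
        if missing:
            print("Missing docs for:", ", ".join(missing))
            sys.exit(1)  # a non-zero exit fails the job, just like a failing test
        print("All packages have docs.")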

The plan is three-pronged. At the base is DocBird, the publishing platform. Above that lie self-service tools and templates. Finally there’s DocDay, an internal doc sprint that aims to promote documentation internally. As they got started they realized there was no coherent plan or standard for docs. Different teams used different tools, formats, and processes; some teams were even using Google Docs. Their internal wiki has more than 60,000 pages across more than 150 spaces.

In treating their docs as code, the docs are checked into git as plain text and are regularly reviewed and merged. This is where technology comes to the rescue: they built a homebrew documentation platform they call DocBird. They hit a few roadblocks in the process. The facts that DocBird is not a wiki, isn’t reStructuredText, and isn’t generated from the source code were all initial sticking points.

Since the docs live in the code repo they’re able to track how far the docs drift from the code. The goal is to keep things in sync, but it’s a challenge.

The template layer of their project is aimed at enabling developers to get to minimum viable documentation as fast as possible. They’re working to enforce consistency in documentation through those same templates; things like shared language are important here. It solves the problem discussed in an earlier talk about how intimidating a blank page is. By filling some default language into the template they help developers get over that initial hurdle.
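A minimal sketch of that template idea (the headings and file layout here are assumptions for illustration, not the real Twitter templates): a small script that stamps out a pre-filled doc skeleton so no one starts from a blank page.

    #!/usr/bin/env python3
    """Generate a minimum-viable-documentation skeleton for a project."""
    import sys
    from pathlib import Path

    TEMPLATE = """\
    # {name}

    ## What it does
    One sentence: what problem does {name} solve?

    ## Quick start
    The shortest path to running {name} locally.

    ## Who to ask
    Owning team and where to reach them.
    """

    def scaffold(name: str, out_dir: str = "docs") -> Path:
        """Write docs/<name>.md with default language already filled in."""
        path = Path(out_dir) / f"{name}.md"
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(TEMPLATE.format(name=name))
        return path

    if __name__ == "__main__":
        print(scaffold(sys.argv[1] if len(sys.argv) > 1 else "my-service"))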

In terms of coverage the goal is to have 100% of known projects covered by docs. They’re bootstrapping this by starting with readme files that point to all the docs that exist; the docs don’t all have to be in DocBird right away.

With only 3 people, using resources well is key. They’re pushing education as a way to give those three documentarians maximum impact. By working on templates, for example, 3 documentarians can influence hundreds of docs. Ultimately, better technology alone will not create a culture of documentation.

The key problem is getting Twitter’s thousand+ engineers to write docs. On the one hand they’re telling the engineers that a lack of docs should be considered technical debt. On the other, though, they’re telling the engineers that if they want other teams to adopt their library they need to package it attractively; docs are a key part of that package.

Write the Docs: Kenneth Reitz – Documentation at Scale

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Kenneth wrapped up the set of talks before the afternoon break. He works for Heroku, a web application deployment company. He’s also a member of the Python Software Foundation and an active member of the community. He seriously contributes to about 18 projects and has around 100 little experiments.

His talk focused on language, and how it works within documentation. Spoken language is what allows us to express ideas to one another. Our written language abilities then allow for us to express those ideas over a historical time scale. As we’ve developed these skills the human hardware hasn’t changed, but the software’s been upgraded.

As we’ve evolved our ability to communicate with one another has expanded from one-to-one conversations to one-to-many. The printing press, for one, gave the ability for a single entity to communicate to an extremely large mass of people. This continued with newspapers, books, television, and radio. This formed the narrative of “the public.”

The next step in communication is many-to-many. This format is inherent to the internet. If you have access to the internet you have access to a universe of information and ideas. The implications of this shift are huge.

Documentation doesn’t have to be about a software workflow or open source project – it can be used to convey ideas much larger than yourself. Its information architecture is inherent and a powerful tool for developing ideas over time. Finally, it enables us to develop ideas on a longer timeline and with more people than we’d ever be able to do by ourselves.

Kenneth works on The Hitchhiker’s Guide to Python, an open source project that’s outlined and curated through a combination of GitHub and Read the Docs.

Write the Docs: Lois Patterson – What Makes Good API Documentation: An Example-Based Approach

I’m at Write the Docs today in Portland and will be posting notes from sessions throughout the day. These are all posted right after a talk finishes so they’re rough around the edges.

Lois got interested in this subject through working on an API product she had to document. The company she works for has a very complex product that they were building an API for. They saw the API as a means of letting developers narrow the scope of functionality available to certain user bases.

To start on this project Lois researched other good API doc sets on the web. What she found was that if you have a good API, the docs are quite likely to be good as well. Twitter, GitHub, Facebook, and Flickr were cited as examples here. Why are these good APIs? First, they segment documentation into buckets; Twitter’s API documentation is broken into tweets, users, and other specific aspects. Similarly, Flickr’s App Garden gives you concrete examples of how other developers have used the API.

While there aren’t inherent API standards, APIs which become extremely popular become de facto standards. The ProgrammableWeb service is a good resource for searching through various public web APIs.

Swagger and Atlassian’s REST browser are handy tools for exploring and testing the content within an API.

Lois argued that any good API documentation should have good descriptions, tutorials, code examples (both snippets and working apps), a sandbox test environment, as well as FAQs. At Lois’ company someone tried to argue that code examples weren’t needed because “any good developer can figure out how to make a working application.” While they may technically be able to figure it out, the job of your API docs is to make that process easier. Clear versioning for the API as well as suggestions on using the API are also helpful.
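As an illustration of why snippets matter, this is the kind of minimal, copy-pasteable first call that good API docs lead with (the endpoint, token, and fields below are hypothetical, not from Lois’ product):

    import requests

    # Hypothetical endpoint and token, shown only to illustrate the shape
    # of a good "first request" example in API documentation.
    API_BASE = "https://api.example.com/v2"
    TOKEN = "YOUR_API_TOKEN"

    response = requests.get(
        f"{API_BASE}/projects",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"status": "active", "per_page": 10},
        timeout=10,
    )
    response.raise_for_status()

    for project in response.json()["items"]:
        print(project["id"], project["name"])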

A first pass on API docs can be as simple as taking the spec and turning it into a user-facing doc. Taking that to the developers then gives you feedback to work from and a set of points to clarify.