LukeW
Publishing in the Generative AI Age
Hey Luke, why aren't you publishing new content? I am... but it's different in the age of generative AI. You don't see most of what I'm publishing these days and here's why.
The Ask Luke feature on this site uses the writings, videos, audio, and presentations I've published over the past 28 years to answer people's questions about digital product design. But since there's an endless number of questions people could ask on this topic, I might not always have an answer. When this happens, the Ask Luke system basically tells people: "sorry I haven't written about this but here are some things I have written about." That's far from an ideal experience.
But just because I haven't taken the time to write an article or create a talk about a topic doesn't mean I don't have experiences or insights on it. Enter "saved questions". For any question Ask Luke wasn't able to answer, I can add information to answer it in the future in the form of a saved question. This admin feature lets the corpus of information Ask Luke draws on expand, but the additions aren't visible to visitors. Think of it as behind-the-scenes publishing.
Since launching the Ask Luke feature in April 2023, I've added close to 500 saved questions to my content corpus. That's a lot of publishing that doesn't show up as blog posts or articles but can be used to generate answers when needed.
Each of these new bits of content can also be weighted more or less. With more weight, answers to similar questions will lean more on that specific answer.
Without the extra weighting, saved questions are just another piece of content that can be used (or not) to answer similar questions. You can see the difference weighting makes by comparing these two replies to the same question. The first is weighted more heavily toward the saved question I added.
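To make the weighting idea concrete, here's a minimal sketch of how a retrieval step like this could work. It is not the actual Ask Luke implementation: the CorpusEntry structure, the toy embed() function, and the weight multiplier are all assumptions for illustration.

```python
from dataclasses import dataclass


@dataclass
class CorpusEntry:
    """One piece of content: an article, a talk transcript, or a saved question."""
    title: str
    text: str
    weight: float = 1.0  # saved questions can be weighted above the default


def embed(text: str) -> list[float]:
    """Toy character-frequency embedding so the sketch runs end to end.
    A real system would call an embedding model here."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]


def rank(question: str, corpus: list[CorpusEntry]) -> list[tuple[float, CorpusEntry]]:
    """Score every entry against the question; an entry's weight boosts its score."""
    q = embed(question)
    scored = [
        (sum(a * b for a, b in zip(q, embed(entry.text))) * entry.weight, entry)
        for entry in corpus
    ]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)


corpus = [
    CorpusEntry("Article on mobile form design", "Label placement and input types on mobile forms..."),
    CorpusEntry("Saved question: recovering from form errors", "When a form submission fails, preserve the user's input...", weight=2.0),
]
for score, entry in rank("How should forms handle errors?", corpus):
    print(f"{score:.3f}  {entry.title}")
```

In a setup like this, the top-ranked entries get passed to the model as context, so a heavier weight makes a saved question more likely to shape the generated answer.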
Using this process triggered a bunch of thoughts. Should I publish these saved questions as new articles on my blog or keep them behind the scenes? What level of polish do these types of content additions need? On one hand, I can simply talk fluidly, record it, and let the AI figure out what to use. Even if it's messy, the machines will use what they deem relevant, so why bother? On the other hand, I can write, edit, and polish the answers so the overall quality of the content corpus stays consistently high. Currently I lean more toward the latter. But should I?
Zooming up a level, any content someone publishes is out of date the moment it goes live. But generated content, like Ask Luke answers, is only produced when a specific person has a specific question. So the overall content corpus is more like a fully malleable singular entity vs. a bunch of discrete articles or files. Different parts of this corpus can be used when needed, or not at all. That's a different way of thinking about publishing (overall corpus vs. individual artifacts) with more implications than I've touched on here.
Smashing Conf: Is Atomic Design Dead?
In his Is Atomic Design Dead? presentation at Smashing Conf New York, Brad Frost discussed the history of design systems and today's situation, especially in light of very capable AI models that can generate code and designs. Here's my notes on his talk.
- Websites started as HTML and CSS. People began to design websites in Photoshop, and as the number of websites and apps increased, the need to manage a brand and style across multiple platforms became clear. To manage this, people turned to frameworks and component libraries, which resulted in more frameworks and tools that eventually got integrated into design tools like Figma. It's been an ongoing expansion...
- There's been lots of change over the years but at the highest level, we have design systems and products that use them to enforce brand, consistency, accessibility, and more.
- Compliance with design systems pushes from one side and product needs push from the other. There needs to be a balance, but currently the gap between the two is growing. A good balance is achieved through a virtuous cycle between product and systems.
- The atomic design methodology intentionally tried to define the use of atoms, molecules, organisms, templates, and pages to bridge the gap between the end state of a product and a design system.
- As an industry, we went too far in resourcing design systems and making them a standalone thing within a company. They've been isolated.
- Design system makers can't be insular. They need to reach out to product teams and work with them. They need to be helping product teams achieve their goals.
- What if there were one global design system with common reusable components? Isn't that what HTML is for? Yes, but it's insufficient because we're still rebuilding date pickers everywhere.
- Open UI tracks popular design systems and what's in them. It's a start to seeing what global component needs for the Web could look like.
- Many pattern libraries ship with an aesthetic and people need to tweak it. A global design system should be very vanilla so you can style it as much as you want.
- The Web still has an amazing scale of communication and collaboration. We need to rekindle the ideas of the early Web. We need to share and build together to get to a common freely usable design system.
- AI models can help facilitate design system work. Today they do an OK job but in the future, fine-tuned models may create custom components on the fly. They can also translate between one design system and another or translate across programming languages.
- This approach could help companies translate existing and legacy code to modern design systems. Likewise, sketches or mockups could be quickly translated directly to design system components, thereby speeding up the process.
- Combining design system specifications with large language models allows you to steer AI generations more directly toward the right kind of code and components (see the sketch after these notes).
- When product experiences are more dynamic (can be built on the fly), can we adapt them to individual preferences and needs? Like custom styles or interactions.
- AI is now part of our design system toolkit and design systems are part of our AI toolkit.
- But the rapid onset of AI also raises higher-level questions about what designers and developers should be doing in the future. We're more than rectangle creators. We think and feel, which differentiates us from just doing production-level tasks. Use your brains, your intuition, and your whole self to solve real problems.
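Returning to the point above about combining design system specifications with large language models, here's a rough sketch of the idea: put the system's component specs into the model's context so generated UI code reuses existing components instead of ad-hoc markup. The spec format and the generate() placeholder are my own assumptions for illustration, not something Brad showed.

```python
# Hypothetical component specs; a real design system would publish these as
# documentation, design tokens, or typed interfaces.
COMPONENT_SPECS = {
    "Button": "Props: variant ('primary' | 'secondary'), disabled. Use for all clickable actions.",
    "DatePicker": "Props: value (ISO date string), onChange. Use instead of hand-rolled calendars.",
    "TextField": "Props: label, value, onChange, error. Always pair with a visible label.",
}


def build_prompt(request: str) -> str:
    """Prepend the design system spec so the model is steered toward its components."""
    spec_lines = "\n".join(f"- {name}: {spec}" for name, spec in COMPONENT_SPECS.items())
    return (
        "You are generating UI code. Use only these design system components:\n"
        f"{spec_lines}\n\n"
        f"Request: {request}\n"
    )


def generate(prompt: str) -> str:
    """Placeholder for a real large language model call."""
    return f"[model response to a {len(prompt)}-character prompt]"


print(generate(build_prompt("Build a signup form with an email field and a submit button.")))
```

The design choice is simply that the spec travels with every request, so generations default to the system's components rather than reinventing them.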
Smashing Conf: How to Use AI to Build Accessible Products
In her How to Use AI to Build Accessible Products presentation at Smashing Conf New York, Carie Fisher discussed using AI coding tools to test and suggest fixes for accessibility issues in Web pages. Here's my notes on her talk.
- AI is everywhere. You can use it to write content and code, create images, and more. It impacts how everyone will work.
- But ultimately, AI is just a tool, and it might not always be the right one. We need to find the tasks where it has the potential to add value.
- Over 1 billion people on the planet identify as having a disability. Accessible code allows them to access digital experiences and helps companies be compliant with emerging laws requiring accessible Web pages and apps. Businesses also get SEO, brand, and more benefits from accessible code.
- AI tools like GitHub Copilot can find accessibility issues in seconds and do so consistently, especially compared to the manual checks currently being done by humans. AI can also spot patterns across a codebase and suggest solutions.
- Existing AI coding tools like GitHub Copilot are already better than linters for finding accessibility issues.
- AI can suggest and implement code fixes for accessibility issues. It can also be added to CI/CD pipelines to check for accessibility issues at the point of each commit (see the sketch after these notes). AI can also serve as an accessibility mentor for developers by providing real-time suggestions.
- More complex accessibility issues, especially those that need user context, may go undetected when relying on AI alone. Sometimes AI output can be incomplete or hallucinate solutions that are not correct. As a result, we can't over-rely on AI alone to solve all accessibility problems. We still need human review today.
- To improve AI accessibility results, provide expanded prompts that reference or include the relevant specifications. Code reviews can double-check accessibility suggestions from AI-based systems. Regularly test and refine your AI-based solutions to improve outcomes.
- Combining AI and human processes and values can help build a culture of accessibility.
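To illustrate the CI/CD point from the notes above, here's a minimal sketch of a commit-time gate: scan the built pages, print any issues, and return a non-zero exit code so the pipeline fails. The check_page() function is a stand-in for a real scanner or AI-based review step, not a tool Carie named.

```python
import sys


def check_page(html: str) -> list[str]:
    """Stand-in accessibility check. It only flags <img> tags missing alt text,
    purely so the sketch is runnable; a real pipeline would call an actual scanner."""
    issues = []
    lowered = html.lower()
    start = 0
    while (idx := lowered.find("<img", start)) != -1:
        end = lowered.find(">", idx)
        tag = lowered[idx:end if end != -1 else len(lowered)]
        if "alt=" not in tag:
            issues.append(f"<img> without alt text at offset {idx}")
        start = idx + 4
    return issues


def main(paths: list[str]) -> int:
    """Return non-zero when issues are found so the CI job (and the commit) fails."""
    total = 0
    for path in paths:
        with open(path, encoding="utf-8") as page:
            issues = check_page(page.read())
        for issue in issues:
            print(f"{path}: {issue}")
        total += len(issues)
    return 1 if total else 0


if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
```

Wired into a pre-commit hook or CI step, a gate like this blocks regressions automatically while humans still review the more context-dependent issues.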