I’m hearing on Twitter, Mastodon, and other social networks that people don’t trust surveys of developer opinions or needs. I’m here to say: good, you probably shouldn’t!
Every source of information has a bias of some kind. You know you are doing the right thing when you find directional alignment across different sources of information and take their biases into account. For example, npm installs are helpful, but they bias toward sites under active development. Sometimes that’s what you want, but as a single source of truth they fall short.
There are some fantastic (public!) resources for understanding developer sentiment, along with some notes on how to put them into context and understand their biases. Are there other resources that should be included? Put them in the comments, or @ me on any of the social networks.
Data source | Description / Caveats |
--- | --- |
State of CSS | 2019, 2020, 2021, 2022, 2023. Yearly survey of front-end developers about CSS. 2021 had far fewer respondents. Publicized on Twitter, so it skews toward developers on social media. ~9K respondents. [demographics] |
State of HTML | 2023. Not published yet. Run in Q3/4 of 2023 for the first time. |
State of JS | 2016, 2017, 2018, 2019, 2020, 2021, 2022. Yearly survey of front-end developers about JavaScript. Skews toward developers on social media. ~40K respondents. [demographics] |
Tash.io State of Web Development | Survey. The conclusions seem odd, which suggests a potential bias. |
npmtrends.com | Compares the relative popularity of different ecosystem solutions to a problem (this link compares popper libraries). Great for understanding the part of the web that is still maintained and addressable. Can be biased by CI and other automated install processes, so it is better to consider relative usage rather than absolute numbers (see the download-count sketch after the table). In 2018/2019 there was little difference between on-premises enterprise installs and public usage; I don’t know whether that still holds. It won’t include information about sites that aren’t being updated. Often that is ok, but it can’t be used to represent the web as a whole: abandoned WordPress blogs (like mine was) would not be included. Popular libraries have millions of weekly downloads. |
Twitter polls | Quick temperature checks to decide if I want to run full surveys. Audience bias inherent to Twitter. |
Thread on missing HTML components. | |
Developer spec reactions | List of spec issues with 10+ reactions (see the search API sketch after the table). Skews toward standards-savvy developers. |
Design system survey | Design system survey by Sparkbox in 2021 |
Stack Overflow survey | Analysis of developer questions and answers on Stack Overflow. |
State of design systems | Design system survey by Google Design in 2019 |
Bug star counts | Bug star counts by area: CSS, Layout, Print |
Greensock forums | Forums for a popular, professional-quality paid animation library. |
HTTP Archive | Tracks tech used on logged-out homepages. Biases toward lower-tech marketing pages. Occasional increases to the set of sampled URLs make comparing results over time a bit muddy. Animation in HTTP Archive. |
MDN Browser Compat Report | Large-scale, one-time survey by Mozilla Developer Network. ~30K respondents. |
Microsoft form controls survey | From a presentation Greg Whitworth and I did at Chrome Dev Summit (CDS). |
The state of design systems | Design system survey by Material Design in 2020 |
Open UI | Open UI GitHub repo, meeting agendas, and notes. |
Design guidelines and design systems | Apple’s Human Interface Guidelines, Material Design, Adobe Spectrum, Shopify Polaris, Airbnb motion design, Figma’s design system list, the Design Systems Repo, Salesforce Lightning. They vary a bunch, so you have to look at more than one. And they don’t take into account newer trends like Tailwind. |
Frameworks | Direct feedback from framework authors helps identify if features are adoptable across various parts of the ecosystem. This includes frameworks like Angular, Vue, and React. It also includes CSS frameworks like Bootstrap and Tailwind. |
Developer interviews | Nothing beats actually talking to developers about issues they are having and how different technologies work for them in their codebase. Generally, developers feel positively about web development, so the results bias positive. To get a more accurate read, often you need to ask how devs feel about themselves (are you struggling to keep up?) versus how they feel about the web (are the right number of new features being released?) |
Web page data (weighted or not by page views) | This data can be very helpful for putting other pieces of data in context. However, it biases toward what is shipped to the web, a lot of which can be legacy or not maintained. Developers won’t make changes to sites that aren’t under active development. It is archeological. |
Core Web Vitals | Great for understanding and comparing performance. The metrics are well vetted and influence search ranking. They bias toward MPAs rather than SPAs because they don’t measure the speed of subsequent SPA page loads (see the measurement sketch after the table). More to come on this. |
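As referenced in the npmtrends row above, you can also pull raw download counts yourself from npm’s public downloads API. A minimal sketch: the endpoint is real, but the package names are just examples (popper-style positioning libraries), the relative-share framing is my suggestion for blunting the CI-install bias, and it assumes Node 18+ for the built-in fetch.

```ts
// Compare relative weekly downloads via npm's public downloads API.
// Package names are examples; swap in whatever libraries you want to compare.
// Assumes Node 18+ (built-in fetch) and an ESM context (top-level await).
const packages = ["@popperjs/core", "tippy.js", "@floating-ui/dom"];

async function weeklyDownloads(pkg: string): Promise<number> {
  // Scoped packages work with the slash left unencoded in this API.
  const res = await fetch(`https://api.npmjs.org/downloads/point/last-week/${pkg}`);
  if (!res.ok) throw new Error(`npm API returned ${res.status} for ${pkg}`);
  const data = (await res.json()) as { downloads: number };
  return data.downloads;
}

const counts = await Promise.all(packages.map(weeklyDownloads));
const total = counts.reduce((sum, n) => sum + n, 0);

// Report shares rather than raw counts: CI installs inflate absolute
// numbers, but relative usage is still meaningful.
packages.forEach((pkg, i) => {
  console.log(`${pkg}: ${((counts[i] / total) * 100).toFixed(1)}% of group downloads`);
});
```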
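The “Developer spec reactions” row can be reproduced with GitHub’s issue search API, which supports a `reactions:>N` qualifier. A hedged sketch: the repo is just an example, and unauthenticated requests are rate-limited.

```ts
// List issues with 10+ reactions in a spec repo via GitHub's search API.
// Unauthenticated requests are rate-limited; add a token header for real use.
interface IssueHit {
  title: string;
  html_url: string;
}

async function popularSpecIssues(repo: string): Promise<IssueHit[]> {
  const q = encodeURIComponent(`repo:${repo} is:issue reactions:>10`);
  const res = await fetch(`https://api.github.com/search/issues?q=${q}`, {
    headers: { Accept: "application/vnd.github+json" },
  });
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  const { items } = (await res.json()) as { items: IssueHit[] };
  return items;
}

// Example: highly-reacted issues on the CSS Working Group drafts repo.
for (const issue of await popularSpecIssues("w3c/csswg-drafts")) {
  console.log(issue.title, issue.html_url);
}
```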
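And for the Core Web Vitals row: the usual way to collect these metrics in the field is Google’s `web-vitals` library. A minimal sketch, assuming web-vitals v3+ naming (`onCLS` etc.), with console.log standing in for whatever analytics endpoint you actually report to. It also illustrates the SPA caveat from the table: these callbacks fire per page load, so soft navigations in an SPA go unmeasured by default.

```ts
// Field measurement of Core Web Vitals with the `web-vitals` package
// (npm install web-vitals). Assumes v3+ naming (onCLS/onINP/onLCP).
import { onCLS, onINP, onLCP } from "web-vitals";

// Each callback fires when the metric value is ready to report (often
// when the page is backgrounded). In an SPA, subsequent soft navigations
// are not measured by default -- the MPA-vs-SPA bias noted above.
onCLS((metric) => console.log("CLS", metric.value));
onINP((metric) => console.log("INP", metric.value));
onLCP((metric) => console.log("LCP", metric.value));
```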
Keep in mind, each of these can be both fantastic and biased. Check things like sample size, participant selection, and subsection of the ecosystem represented.
Comments
One response to “What are developers thinking?”
The biggest problem with surveys is always reaching a representative cross-section of respondents. Depending on experience and level of knowledge, people answer differently. Survey results should always be evaluated in light of their survey method. Thanks for the summary!