A look into the past, present and future, by a frontend developer
25 years ago, when I started creating websites, things were fairly easy. It was the very early days of the world wide web, and for most client projects you could build everything on your own in just a few days. Then things became more and more complex as clients, users and our very own expectations grew year by year. People started to specialise, and in each field developers now have more skills than ever before and need to know more than the full stack developer of the past.

While I don’t think it will ever be as easy again as it was at the beginning, I think we are now on the right path for more people to become full stack developers once again. That’s why I recently gave a talk at the Frontend Conference and why I try to motivate people to go in the frontend-first, full stack direction.

What I want to lay out in this post are the key technologies and development patterns that will bring us there: technologies that close the gaps between different fields (design, frontend and backend development), that bring things back together in areas where we lost the important concept of a single source of truth, and that remove friction between different teams or merge those teams completely. And, last but not least, technologies that reduce and hide the complexity of each field, so that we can use the freed-up time and mental capacity to improve our products even further and make our clients happy.

This is an opinion piece on how I see the current situation as a developer, and it’s a view into the not-so-distant future, as many of the technologies I will talk about have not yet completely matured.
As a frontend developer, the main improvement in recent years came when React made component-based layouts popular. jQuery, the previously dominant library that everybody used, was great for small things, but as applications became more complex and more and more business logic moved from the backend to the frontend, the time was ripe for a new paradigm. Backbone.js was a middle step we used for a while, but it was far from perfect. With React and components, it became possible to construct big, complicated things out of simple, small parts. Other frameworks like Angular and Vue followed the lead, and we now have many great options to choose from. While jQuery still has its place in small projects, as soon as you need to create new elements on a page, elements that interact with each other, with DOM updates that perform well, there is simply no way around components these days.
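The core idea of components can be sketched without any framework at all: small pieces compose into bigger ones. This is a minimal, framework-free illustration (the component names `Avatar`, `UserCard` and `UserList` are made up for the example); React or Vue components follow the same principle with real DOM handling on top.

```javascript
// Components as plain functions: each one owns its small piece of markup,
// and bigger components are composed out of smaller ones.

function Avatar({ url }) {
  return `<img class="avatar" src="${url}" alt="">`;
}

function UserCard({ name, avatarUrl }) {
  // A bigger component reuses the smaller one instead of duplicating it.
  return `<div class="card">${Avatar({ url: avatarUrl })}<h2>${name}</h2></div>`;
}

function UserList({ users }) {
  return `<ul>${users.map(u => `<li>${UserCard(u)}</li>`).join('')}</ul>`;
}

const html = UserList({
  users: [{ name: 'Ada', avatarUrl: '/ada.png' }],
});
console.log(html);
```

The payoff is that complexity stays local: fixing or restyling `Avatar` automatically fixes it everywhere it is composed.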
Next it was time to get rid of the MVC (Model, View, Controller) architecture pattern. The problem with MVC on the frontend was that debugging became harder and harder the bigger the application grew. This happens mainly because multiple controllers can send and receive the same events, creating a tangled mess if you’re not very careful. Especially on bigger projects with multiple developers, it’s almost impossible to know how everything is connected, leading to codebases that are really hard to debug and optimise. The Flux pattern, and especially the Redux library that refined the concept, solved this by introducing a single direction in which data flows through the application. Additionally, Redux added the concept of immutability, so that you always knew which state followed which state and which actions caused which changes. This made debugging easy again and removed a huge amount of complexity. Interestingly, we gained these new powers by limiting what the programmer could do, basically forcing the programmer not to create a mess.
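The heart of this one-way data flow fits in a few lines of plain JavaScript, no Redux required: a pure reducer that never mutates state, only returns a new one. The action names and state shape below are invented for illustration.

```javascript
// A minimal sketch of the Flux/Redux idea: state only changes by
// dispatching actions through a pure reducer, and every transition
// produces a brand-new (immutable) state object.

const initialState = { items: [], filter: 'all' };

function reducer(state = initialState, action) {
  switch (action.type) {
    case 'ITEM_ADDED':
      // Never mutate: spread the old state and arrays into new ones.
      return { ...state, items: [...state.items, action.payload] };
    case 'FILTER_CHANGED':
      return { ...state, filter: action.payload };
    default:
      return state;
  }
}

// Replaying the same action log always yields the same state - this is
// exactly what makes time-travel debugging possible.
const actions = [
  { type: 'ITEM_ADDED', payload: 'write talk' },
  { type: 'FILTER_CHANGED', payload: 'open' },
];
const finalState = actions.reduce(reducer, initialState);
console.log(finalState); // { items: ['write talk'], filter: 'open' }
```

Because the reducer is pure and the state immutable, any bug can be reproduced by replaying the recorded actions, which is the "easy debugging" the pattern buys you.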
After that, the same people who made components and Flux popular in the frontend community came up with a solution to the pain points we had in communicating with backend services. At the time, many still praised REST for its simplicity and were thankful to finally be rid of over-engineered API protocols like SOAP. But these developers identified the problems with REST, especially on mobile connections, and created the next step: one that would not only solve most of the problems we faced with REST as endpoints became more complex, but also enable tooling that is the source of further improvements we’ll look at later. The solution was quite simple. With GraphQL you have only one endpoint, and you can query whatever you want over it. This removed the need for multiple queries and solved the over- and under-fetching of non-custom REST APIs, and with that, the performance issue of having to query the server multiple times to get all the data we need. And because a GraphQL endpoint can be introspected, you not only always know the capabilities of an API just by using it; it also enabled the creation of a lot of amazing tools on top of the different APIs, solving things like caching and documentation, to name just two.
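From the client side, "one endpoint, ask for exactly what you need" looks roughly like this. The field names and the `/graphql` path are illustrative, but the shape is standard: a single POST carrying the query document and its variables.

```javascript
// With REST, rendering an article teaser would typically mean three
// round trips (/articles/1, /authors/7, /articles/1/comments), each
// returning more fields than the teaser uses. With GraphQL, one query
// names exactly the fields it needs, nested data included.

const query = `
  query ArticleTeaser($id: ID!) {
    article(id: $id) {
      title
      author { name }
      comments { text }
    }
  }
`;

function buildRequest(id) {
  return {
    url: '/graphql', // the single endpoint for everything
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query, variables: { id } }),
  };
}

const request = buildRequest('1');
console.log(request.url); // '/graphql'
```

No over-fetching (unused fields are simply not listed) and no under-fetching (nested relations arrive in the same response).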
What GraphQL enabled, and where I’m mainly involved in the global open source community, is realistic mocking of APIs. Because GraphQL enables tooling, and because you now have one single, super flexible endpoint, it became possible to mock an API perfectly and realistically in a way that was simply impossible with REST. GraphQL engines like Prisma and Hasura can now mock full-featured, realistic APIs based on just a schema definition. And in the case of Hasura and my own library, Blowson, we don’t even need to fill in sample data manually anymore; we can generate sample data with minimal effort. This makes it possible to iterate super fast on our APIs and frontends without ever having to implement a single line of backend code. Whoever has tried to migrate an API knows exactly how difficult and time-consuming this can be when you have to work with real user-generated data, but if you work with realistic sample data, changes become almost a non-issue. I’m personally super excited about this development, as it leads to faster iterations and, ultimately, to happier clients.
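The general idea of schema-driven sample data can be sketched in a few lines. To be clear, this is not the actual Blowson or Hasura API, just an illustration of the principle: given the field types of a schema, generate plausible records instead of writing them by hand.

```javascript
// Hypothetical schema: each field maps to a type the generator knows.
const userSchema = { id: 'id', name: 'name', email: 'email' };

// One tiny generator per field type; a real tool would have many more
// and produce far more varied, realistic values.
const generators = {
  id: i => i + 1,
  name: i => ['Ada', 'Grace', 'Edsger'][i % 3],
  email: i => `user${i + 1}@example.com`,
};

function mockRecords(schema, count) {
  return Array.from({ length: count }, (_, i) => {
    const record = {};
    for (const [field, type] of Object.entries(schema)) {
      record[field] = generators[type](i);
    }
    return record;
  });
}

const users = mockRecords(userSchema, 2);
console.log(users);
// [{ id: 1, name: 'Ada', email: 'user1@example.com' },
//  { id: 2, name: 'Grace', email: 'user2@example.com' }]
```

Change the schema, regenerate, and the whole frontend can be exercised against the new shape immediately, with no migration of real data involved.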
Another very difficult problem that is currently being solved, again thanks to GraphQL as a tooling enabler, is caching. While in the past we mostly leveraged CDNs (content delivery networks) for media files, many new services have become much more flexible and powerful. For example, there are now multiple services that can cache GraphQL API requests at the edge of the network. They do that by analysing the GraphQL queries we send through them. Because GraphQL is schema-based, and by knowing the capabilities of an endpoint, a CDN knows exactly when to ask the server for fresh data and when it can deliver it from the edge of the network. This type of caching can also cover data that used to be very complicated or even impossible to cache, such as user data or shopping carts.
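A simplified sketch of why this works: because a GraphQL query document fully describes what is being requested, the query plus its variables make a precise cache key, and per-field cache lifetimes can be derived from the schema. The field names, lifetimes and the naive root-field extraction below are all invented for illustration; real edge caches are far more sophisticated.

```javascript
// Illustrative per-field max-age, as an edge cache might derive from
// schema annotations. 0 means "never cache" (e.g. a live cart).
const maxAgePerField = { article: 300, cart: 0 }; // seconds

const cache = new Map();

function rootField(query) {
  // Naive: grab the first field inside the outer braces.
  return query.match(/\{\s*(\w+)/)[1];
}

function cachedExecute(query, variables, execute, now = Date.now()) {
  const key = JSON.stringify({ query, variables }); // precise cache key
  const ttl = (maxAgePerField[rootField(query)] || 0) * 1000;
  const hit = cache.get(key);
  if (hit && now - hit.time < ttl) {
    return { data: hit.data, fromCache: true };
  }
  const data = execute(query, variables); // ask the origin server
  cache.set(key, { data, time: now });
  return { data, fromCache: false };
}

const execute = () => ({ ok: true }); // stand-in for the real server
const q = '{ article(id: 1) { title } }';
const first = cachedExecute(q, {}, execute, 1000);
const second = cachedExecute(q, {}, execute, 2000);
console.log(first.fromCache, second.fromCache); // false true
```

The same mechanism leaves the `cart` query uncached on every request, which is how such services can safely sit in front of personalised data.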
Headless CMS
Not every backend technology makes sense to build with Serverless, however; for example, it still makes sense to have a dedicated content management system and a central database. While traditionally a CMS rendered markup, this is becoming less and less the case. A wave of newcomers made headless popular by completely decoupling the frontend from the backend. This has multiple advantages and very few disadvantages, as I lay out in Frontend First Development. And now even established enterprise CMS like Drupal are moving in this direction. For a frontend developer, it never made much sense to have markup rendered by the CMS, especially since frontends have become more and more complex. Changes in how we build layouts, for example with flexbox, and things like accessibility and HTML5 made it super important that markup take a very specific shape, often widely different from the default output of a CMS. By removing the need to adapt CMS-rendered markup, headless systems let us have one single source of truth for markup, created by the experts in how that markup has to be constructed.
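The headless split can be shown in a dozen lines. The CMS delivers only structured content; the frontend, where the markup experts sit, decides the exact markup shape. The response shape and field names here are illustrative, not any specific CMS's API.

```javascript
// What a headless CMS returns: pure structured content, zero markup.
const cmsResponse = {
  type: 'teaser',
  title: 'Frontend first',
  body: 'Why the frontend should drive development.',
};

// The frontend owns the markup: semantic elements, accessibility
// attributes, class names for the flexbox layout - all decided here,
// in one place, instead of being adapted from CMS output.
function renderTeaser(content) {
  return [
    '<article class="teaser">',
    `  <h2>${content.title}</h2>`,
    `  <p>${content.body}</p>`,
    '</article>',
  ].join('\n');
}

const markup = renderTeaser(cmsResponse);
console.log(markup);
```

When accessibility requirements or the layout technique change, only `renderTeaser` changes; the CMS and its content are untouched.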
Component-integrating design tools
Lastly, a new breed of design tools is becoming available on the market that tries to bridge the gap between design and frontend code. Most notable are FramerX, Alva, BuilderX and, with a slightly different focus, Figma. Essentially, these tools make components from the frontend available inside the design tool, solving another part of our developer pipeline where we had multiple sources of truth. This way designers can lay out pages and modules with real components and make sure that everything will look and work the same in the design as in the final implementation. Some companies, like Dropbox, have already started to use FramerX to optimise communication between designers and frontend developers.
Predictions of the future are very hard and mostly fail to be accurate, but they are a great instrument to guide development in the right direction. I was lucky to predict very early that React, GraphQL and Flux would become popular, so maybe I will be lucky with the following predictions as well. Well, wish me luck.
Database agnostic CMS
Currently, all major enterprise content management systems, and even most of the new breed of headless systems, manage their own databases and APIs. I think this will change drastically. GraphQL engines are much more flexible and better optimised for cloud infrastructure than any of the traditional database implementations CMS use these days. I think the next logical step is to completely decouple the database from the content management system. We already started to decouple the rendering of markup with the rise of headless systems, and this is simply the step after that. GraphQL engines will be the ORM of future content management systems, bridging the gap between a CMS and a GraphQL API in a nicely decoupled way.
Frontend first development workflow
All the technologies mentioned above make it possible to work on a frontend prototype much longer than in the past. We know that clients understand problems much better when they can see and interact with them, and by faking the backend and creating much more realistic prototypes, we can give them that ability without losing the speed of working with completely fake data in static site prototypes, as we mainly do now.
Visually, we already break everything down into tiny components with the new breed of component-based frontend frameworks, and I think the same thing is happening on the backend as well. Serverless functions and microservices will take over more and more of the tasks that a CMS used to do, and this trend will not stop.
CMS with a focus on the content editor
Many of the currently popular enterprise content management systems will have to hand responsibility over to microservices. This gives them a chance to focus more on the content editor’s experience of using the product. Usability will improve drastically as CMS teams can focus on the user experience instead of implementing backend services from scratch.
Admin interfaces will be highly customised. Not only will we theme admin interfaces for clients, we will be able to completely customise each admin interface for each content editor role. This will make the technology much more accessible and will be a huge focus of backend and frontend developers. Thankfully, the Drupal team is already working in this direction.
Design Systems become central in all projects
As design tools and frontend technologies move closer together, design systems will more and more become the basis for everything. This will make it possible to adapt sizes, whitespace, colors, fonts and much more at any time during the development of a project without breaking things.
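One common way this works today is through design tokens: sizes, whitespace, colors and fonts live in one shared object that both the design tool and the components consume. The token names and values below are illustrative.

```javascript
// Design tokens: the single source of truth a design system provides.
const tokens = {
  color: { primary: '#0055ff', text: '#222222' },
  space: { s: '0.5rem', m: '1rem', l: '2rem' },
  font: { body: '16px/1.5 system-ui' },
};

// Components reference tokens instead of hard-coding values, so a
// token change mid-project propagates everywhere without breakage.
function buttonStyle(t) {
  return {
    background: t.color.primary,
    padding: `${t.space.s} ${t.space.m}`,
    font: t.font.body,
  };
}

console.log(buttonStyle(tokens).background); // '#0055ff'
```

Rebranding then means editing `tokens.color.primary` once, and every component, and ideally the design tool showing the same components, picks it up.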