It's been nearly 20 years since I started my first job. Since then, the IT industry has undergone significant changes, and in this recounting, I want to describe how it has evolved.
Back in 2000, not many people aspired to be programmers, and the term "programmer" wasn't commonly used. The mass adoption of the internet was just beginning, and the average internet user didn't question how it worked or who was behind it. This was the era of numerous web studios: companies that offered to build websites in the then-small internet landscape. Beyond websites meant for browsing catalogs and finding phone numbers, there was little else.
In essence, the product itself was simple, but the underlying technology was complex; for an ordinary person, how it all worked was nearly incomprehensible. IT companies were mainly founded by individuals closely connected to computers, those who tinkered with them on their own. The founders of such companies often hired based on simple tests or let candidates work for a trial period to assess their fit. Almost always, these candidates were enthusiasts who had no trouble with computers and were willing to take a job just to be closer to them.
A sharp upswing in the industry occurred between 2005 and 2012, producing many talented individuals who continue to work in IT to this day. The number of people eager to work in the industry became enormous. Previously, one person, usually called a "programmer" (sometimes even a "system administrator"), did all the work. There was no division of labor into architects, business analysts, QA testers, developers, and so on. All these roles and responsibilities were combined in one person, who managed to handle everything; otherwise, we wouldn't have what we have today.
In teams where almost everyone could do everything, tasks were allocated based on expertise. If someone had more experience with Linux, they would handle tasks related to configuring Apache/Nginx. They would then explain what they had done, and others would admire their achievements. It was the golden age of IT. By the way, you can probably see similarities here with the cross-functional teams everyone talks about today. With the arrival of Web 2.0, new system patterns emerged. People started to "desire" certain enhancements (in reality, they were simply shown these were possible, albeit not strictly necessary), such as pages that didn't require full reloads. This innovation changed work dynamics, and this is precisely where the terms "frontend" and "backend" started to gain their current meanings. The division of labor began. At the time, company owners and hiring managers tried to hire people who could handle both backend and frontend, what is today known as a "full-stack" developer. They viewed specialization with skepticism, questioning how someone could be proficient in only half of the equation (as it turns out, they were right to ask).
To be a complete web developer in those days, one needed to know how to:
Set up a web server.
Install a programming language (often from source code).
Connect the programming language to the web server through a CGI module, such as mod_php, mod_python, or mod_perl for Apache.
Install and configure a database (with knowledge of advanced settings).
Know SQL and the particulars of MySQL/PostgreSQL.
And all of this was often done on Windows, with rare exceptions on Linux. Even when setting up under Windows, a minimum level of Apache knowledge and basic Linux skills was required, especially when going through Cygwin.
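To give a flavor of that workflow, here is a minimal sketch of a raw CGI handler in Python, assuming a web server (such as Apache) configured to execute scripts from a cgi-bin directory; the protocol itself is just environment variables in, stdout out:

```python
#!/usr/bin/env python3
# Minimal CGI: the web server puts request metadata into environment
# variables, passes the POST body on stdin, and sends whatever we
# print to stdout back to the browser as the HTTP response.
import os
import sys

method = os.environ.get("REQUEST_METHOD", "GET")
query = os.environ.get("QUERY_STRING", "")

body = ""
if method == "POST":
    length = int(os.environ.get("CONTENT_LENGTH") or 0)
    body = sys.stdin.read(length)

# Headers first, then a blank line, then the document itself.
print("Content-Type: text/html")
print()
print(f"<html><body><p>Method: {method}</p>"
      f"<p>Query: {query}</p><p>Body: {body}</p></body></html>")
```

Every request spawned a fresh process running a script like this, which is exactly why the server, the OS, and the language had to be understood together.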
Later on, installers emerged (such as XAMPP or WampServer) that bundled these components together and were managed through the Windows system tray. This streamlined the initial installation but didn't necessarily reduce the complexity of understanding what was happening.
Thus, a typical developer 15 years ago needed to know three main areas: systems administration, a programming language, and front-end development (HTML+CSS+JavaScript, usually with jQuery).
Over these two decades, many custom frameworks and libraries were developed. The phenomenon was that every programmer who reached a certain level of expertise wanted to create their own framework to work more efficiently. Much of this work was open source and shared within small communities in IRC chats.
The frameworks that endured became widely used: Django, Flask, Symfony, Rails, Catalyst, and a handful of others.
All of these frameworks followed the MVC (Model-View-Controller) paradigm, which is still prevalent in some form today. However, building on them and achieving full functionality still required full-stack knowledge, just as before. MySQL and PostgreSQL were used almost universally; in skilled hands, these databases allowed fairly large projects to be developed without external optimizations or additional architecture. A developer from that era could support such significant projects single-handedly or within a team. What constitutes a significant project will be discussed below.
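As a rough illustration of that separation, here is a schematic MVC sketch in plain Python, not the API of any particular framework; the table name and function names are invented for the example:

```python
import sqlite3

# Model: owns data access; knows nothing about HTTP or HTML.
def get_articles(db_path="site.db"):
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT title, body FROM articles").fetchall()
    return [{"title": title, "body": body} for title, body in rows]

# View: turns data into a representation; knows nothing about storage.
def render_article_list(articles):
    items = "".join(f"<li>{a['title']}</li>" for a in articles)
    return f"<html><body><ul>{items}</ul></body></html>"

# Controller: receives the request, calls the model, picks the view.
def articles_controller(request):
    return render_article_list(get_articles())
```

Django, Rails, and the rest each dressed this pattern up differently, but the division of responsibilities was the same.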
In 2010, server-side JavaScript arrived in the form of Node.js, which prompted frontend developers to transition into the world of backend programming. Naturally, they lacked the necessary knowledge, and developing entirely in JavaScript was challenging; they essentially had to learn backend development anew. They reinvented frameworks that already existed in other languages, only now in JavaScript. This became even more pronounced with the introduction of npm, the package manager for JavaScript.
Starting from 2013, there was a massive boom in the development industry, and everyone wanted to enter IT. This led to the emergence of numerous courses, and companies began hiring interns who worked for minimal pay or even for free until they started to grasp the concepts. This boom continues to yield somewhat undesirable results, chiefly that a large number of people with little to no prior computer experience found their way into IT to earn what they consider a high salary. Almost all the courses on offer focused on either JavaScript or Python, and they often promised job placement after completion.
After some time, several other popular system languages emerged, such as Rust and Go, which also gave rise to web frameworks.
Let's consider a typical task that exists in every project: registration, authentication, and a user dashboard with buttons. Fifteen years ago, a single developer could accomplish this task with a full understanding of each system component: "Browser -> HTTP POST -> API -> Database -> INSERT INTO" ... and so on. The developer's skill level almost didn't matter; to make it work at all, they had to perform every one of these steps themselves. Today's developer, however, would struggle to complete such a task independently, without frameworks, because they don't understand what's happening behind the scenes.
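To make the scale concrete, here is a minimal sketch of that entire chain in one file, using only the Python standard library; the schema is invented, and validation and password hashing are deliberately omitted:

```python
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

conn = sqlite3.connect("users.db")
conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT UNIQUE)")

class RegistrationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Browser -> HTTP POST: read the submitted JSON body.
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length))
        # API -> Database -> INSERT INTO: persist the new user.
        with conn:
            conn.execute("INSERT INTO users (email) VALUES (?)", (data["email"],))
        # Respond to the browser so the dashboard can update.
        self.send_response(201)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"status": "registered"}).encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RegistrationHandler).serve_forever()
```

One developer could hold every hop of this pipeline in their head; today the same few lines are typically spread across a frontend form, an API layer, an ORM, and a managed database, each owned by different people.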
The division of labor among programmers split the work into two parts: "Browser -> HTTP REQUEST" (frontend) and "API -> Database -> INSERT INTO" (backend). Consequently, an average developer fresh out of courses would be unable to complete this trivial task as a whole. Another issue is the use of frameworks without fundamental knowledge. Suppose a person chooses a full-stack framework like Next.js, which lets them handle both backend and frontend without understanding the inner workings. With examples from the documentation and a few attempts, almost anyone could implement a simple task, such as user registration via email.
This raises a valid question: why did we separate work only to return to the full stack, when we always knew that a person skilled in backend could do more than someone skilled only in frontend? The answer to this question is straightforward and consists of two parts. Firstly, there was no single language with the same syntax that could work both in the browser and on the server. Secondly, the sale of software or some software solution often occurred through a demo or prototype, which usually featured a frontend interface. For a business to obtain a prototype, having one frontend programmer who creates the visual part and possibly some server-side functionality is often sufficient.
What about graphic design? In those days, website graphic design was delivered as PSD files, handed to the developer, often the frontend developer. From these files, they would slice the necessary graphical elements themselves and carry out the layout. As for different screens and browsers, this wasn't a major concern at the time, and when it was, not much was done about it. PSD files were essentially a set of entire page layouts rendered as raster graphics. Occasionally, if the design was good enough, a UI Kit would be prepared, containing separate sets of elements in layers that could be exported and used almost immediately.
Then Docker emerged, relieving those who didn't know Linux from having to learn it. Everything started revolving around containers, and there was no longer a need to understand the specific settings of services running under Linux. Knowledge became surface-level: knowing which configuration to pass in, rather than deeply understanding how things work inside.
It's worth mentioning HR. Twenty years ago, HR departments were almost nonexistent. Everyone looked for acquaintances interested in programming, so the people hired were often highly motivated and enthusiastic. Nowadays, the average hiring process goes through HR, which has become a separate industry with its own tools for handling potential candidates.
What do modern interviews look like? They mainly consist of cliché questions about algorithms and general problem-solving. This approach is not only useless for businesses, since it doesn't test the experience the business actually needs, but it also harms the industry by perpetuating itself: recently trained newcomers, having been through such interviews, go on to ask the same questions of other newcomers. What's the problem with this approach? It has nothing to do with web development, where 90% of tasks involve saving values from a frontend form to a database, while the remaining 10% involve processing, transforming, and executing actions. Any developer, even an intern fresh out of courses, can handle the 90%; the rest is best left to more experienced professionals. So why conduct complex interviews if you don't have tasks that align with the questions you're asking? People hired to design forms in ReactJS and send JSON to an API are asked about O(n) or O(log n) complexity. Fifteen years ago, hardly anyone in web development thought about O(n), and these questions weren't part of everyday work; when problems arose, solutions were found. Naturally, not every interview is conducted this way; it's more of a 30% (common sense) to 70% (lack of common sense) split.
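For context, this is the sort of thing those questions probe: a linear scan is O(n), while binary search over sorted data is O(log n); perfectly sound knowledge, just rarely the bottleneck in form-to-database work. A sketch in Python:

```python
# O(n): examine elements one by one until the target is found.
def linear_search(items, target):
    for index, item in enumerate(items):
        if item == target:
            return index
    return -1

# O(log n): halve the sorted search space at every step.
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```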
What does modern developer work entail, and how does it differ from work 20 years ago?
Let's start with the past:
70% of tasks involved fetching or storing information in MySQL and displaying it cleverly on websites.
20% were complex tasks for processing, usually triggered via cron in the background at night.
10% involved server setup, configurations, backups, and firewalls.
Communication happened on custom IRC servers, Jabber, email, and ICQ.
No online calls (yes, 15 years ago, nobody connected with others online to discuss issues).
Googling, or more accurately, searching, wasn't very helpful; everyone read official language documentation and found examples on forums where developers of those languages gathered.
Work was often driven by enthusiasm; tracking time was rare, but deadlines existed.
90% of work was done in the office to have access to computers.
90% of workers had their own computer at home.
99% of developers played PC games.
90% engaged in programming "for fun" after work.
There was no Git and no GitHub.
Initially, there was CVS or SVN; later, Git emerged, but hardly anyone aimed to use them efficiently.
No CI/CD; deployment was done via a bash script, often one that ran directly on the server and performed an SVN update.
No language package managers; only system package managers were available.
Text editors and IDEs then were: Emacs, Vim, FAR Manager, EditPlus, Notepad++, IDEA, Eclipse.
Now let's consider the present:
90% of tasks involve submitting forms or displaying lists of information, usually through JSON APIs, with storage in PostgreSQL / MongoDB / Firebase or other managed cloud storage solutions. Nobody configures databases manually, and few know the fine-tuning parameters.
10% of tasks require Googling for over 30 minutes.
Almost all tasks are primarily solved through Google searches, taking about 5-15 minutes, with solutions directly copied and pasted into code, verified, and pushed to production.
Communication occurs on modern platform messengers like Slack, Teams, etc., and calls happen on Zoom, Google Meet, Teams, Slack.
Calls throughout the day are common, especially in less experienced teams, where many engineering problems are discussed out loud.
~80% of people work remotely from home.
Time tracking varies; it can involve screen tracking tools on corporate laptops or manual time sheets.
Equipment is provided by the company in 80% of cases; only 15% buy their own laptop, and the rest use desktops.
<20% play PC games, and another 10% have gaming consoles.
<20% engage in programming "for fun" after work.
Numerous CI/CD tools are available to suit different preferences, but they're often used with default configurations as a deep understanding of their workings is not required.
In summary, the phenomenon of a widespread interest in the IT industry has led to changes that involve a broader mix of individuals entering the field, a shift in work dynamics, a dominance of certain programming languages, and a notable simplification of entry-level tasks. However, at the same time, the industry has become more complex for those seeking to advance their skills. The transformation from dedicated enthusiasts to a diverse workforce with varying motivations is evident, accompanied by changes in work culture, tools, and practices.
The paradigm has shifted from computer enthusiasts with custom-built machines to corporate workers with smoothies and rented Macs.
This text is only a draft, and there are many details still to be described. I'll continue if you're interested. Feel free to comment.