This is just trying too hard. "Servant leadership" is a buzzword invented to divert attention from the power mechanics that hierarchical organizations are founded upon, i.e., the boss (sorry, "leader") commands and the direct reports execute. Being a "servant" basically just means being a decent human being: putting people in the right conditions to carry out their duties, not coming up with unrealistic expectations, and doing the required 1:1 coaching/mentoring for career development. Hand-holding employees, as this "blocker removal" interpretation of servant leadership seems to imply, is just the pathway to micromanagement. It's OK to shield your juniors from the confusing world of corporate politics, but if your direct reports need you to do a lot of the sanitization/maturation of work items and requirements, then why should you even trust their outputs? At that point you're basically using them the way you'd prompt an AI agent: double- and triple-checking everything they do, checking in three times a day, etc. This "transparent" leadership is servant leadership, or what it's intended to be in an ideal world, anyway. Some elements of it are easily applicable, like the whole coaching/connecting/teaching, but they are also the least measurable in terms of impact. The "making yourself redundant" part, i.e., avoiding being the bottleneck middleman without whose approval/scrutiny nothing can get done, is fantasy outside of flat organizations or magical rainbowland companies where ICs and managers are on the exact same salary scale. And it will continue to be as long as corporate success (and career-growth opportunity) is generally measured as a function of number of reports / size of org managed.
"Servant leadership" is not a buzzword, but it's been misused and abused by Big Corporations to the point that it has basically lost its meaning [1]. For me, personally, the idea is about being less of a boss and more of a nightwatchman or janitor. I believe in agency and ownership, and in a sane environment people can be left alone with clear objectives. It's more about removing obstacles. I'll give you a simple example. Once a week a maid comes to our apartment. Despite a clear disproportion in the power balance (it's easier to find a new maid than a senior engineer), and despite her being used to staying invisible and prioritizing not disturbing tenants, for me it's the other way around. I'm happy to hastily finish a call or leave my room if she feels the need to disturb me, and if she needs an extra pair of hands I'm happy to help her with anything. After all, I'm more interested in the final result than in feeling important. We have a bucket list of tasks that have to be performed which slightly exceeds her capacity, and she has the full right to prioritize things. It took me a while, but I eventually convinced her that it's OK to skip things, like cleaning the windows, if she's feeling under the weather or it's cold outside, rather than faking it. Most of the pointy-haired bosses I worked with in corporate environments would probably prepare a list of requirements and walk through the apartment with a checklist every time she finished, giving her a full, harsh performance review. But that doesn't build trust and a long-term relationship. And after some time she developed what people around here call ownership, and sometimes I feel she cares about the household more than I do. Hope that makes sense.
If you read the original JavaScript press release ( https://web.archive.org/web/20020808041248/http://wp.netscape.com/newsref/pr/newsrelease67.html ), it's mainly intended as a language to write glue code so Java applets (where the real application logic would go) can interact with a webpage:

> With JavaScript, an HTML page might contain an intelligent form that performs loan payment or currency exchange calculations right on the client in response to user input. A multimedia weather forecast applet written in Java can be scripted by JavaScript to display appropriate images and sounds based on the current weather readings in a region. A server-side JavaScript script might pull data out of a relational database and format it in HTML on the fly. A page might contain JavaScript scripts that run on both the client and the server. On the server, the scripts might dynamically compose and format HTML content based on user preferences stored in a relational database, and on the client, the scripts would glue together an assortment of Java applets and HTML form elements into a live interactive user interface for specifying a net-wide search for information.

> "Programmers have been overwhelmingly enthusiastic about Java because it was designed from the ground up for the Internet. JavaScript is a natural fit, since it's also designed for the Internet and Unicode-based worldwide use," said Bill Joy, co-founder and vice president of research at Sun. "JavaScript will be the most effective method to connect HTML-based content to Java applets."

This was all actually implemented. JavaScript functions could call Java applet methods and vice versa (see https://docs.oracle.com/javase/8/docs/technotes/guides/deploy/javafx_javascript.html ). Of course, over time everyone abandoned applets because of all the security problems, and JavaScript became a good enough language to write application logic in directly.
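For the curious, that bridge (LiveConnect) looked roughly like this. This is a historical sketch, not runnable in any modern browser: the applet name and method names here are made up for illustration, and it required the old in-browser Java plugin.

```
<!-- Hypothetical page embedding a weather applet; "weather" and
     showForecast() are illustrative names, not from the press release. -->
<applet code="WeatherApplet.class" name="weather" width="300" height="200"></applet>

<script>
  // JavaScript calling a public method on the applet instance:
  document.applets["weather"].showForecast("Mountain View");
</script>
```

Going the other direction, the applet could reach back into the page via the `netscape.javascript.JSObject` class, e.g. `JSObject.getWindow(this).call("someJsFunction", args)`, which is exactly the "glue" role the press release describes.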
Still, there's more meaning behind the name than it just being a cynical marketing move.
The story is somewhat more complicated than that and not amenable to a simple summary, because there are multiple entities with multiple motivations involved. Keeping it simple, the reason the press release babbles about that is that it is corporate Netscape talking at the height of the Java throat-forcing era. Those of you who were not around for it have no equivalent experience of how Java was being marketed back then, because no language since has been backed by such a marketing budget; Java was being crammed down our throats whether we liked it or not. Not entirely unlike AI today, only programmers were even more specifically targeted, and may well have seen more inflation-adjusted dollars spent per person, since the set of people being targeted was so much smaller than AI's "everyone in the world" target.

This cramming had no regard for whether Java was a good solution for a given problem, or indeed whether the Java of that era could solve the problem at all. It did not matter. Java was Good. Good was Java. Java was the Future. Java was the Entire Future. Get on board or get left behind. It was made all the more infuriating by the fact that the Java of this time period was not very good at all: terrible startup, terrible performance, absolutely shoddy support for things we take for granted nowadays like GUIs or basic data structure libraries, and garbage APIs shoved out the door as quickly as possible so they could check the bullet point that "yes, Java does that", like Java's copy-of-a-copy of C++ streams (which are themselves widely considered a terrible idea and an antipattern today!).

I'm not saying this because I'm emotional or angry about it or hate Java today. Java today is only syntactically similar to Java in the 90s; it hardly resembles it in any other way. Despite the emotional tone of some of what I'm saying, I mean this as descriptive.
Things really were getting shoveled out the door with a minimum of design and no real-world testing, so that the Java they were spending so much marketing money on could be said to do it all: yes! It connects to this database! Yes! It speaks XML! Yes! It has a cross-platform GUI! These things all barely worked as long as you didn't subject them to a stiff breeze, but the bullet point was checked.

The original plan was for Java to simply be the browser language, because that's what the suits wanted, probably because that's what the suits were being paid to want. Anyone can look around today and see that Java is not a great match for a browser language, and that a scripting language was a better idea, especially for the browser in its early days. However, the suits did not care. The engineers did, and they were able to sneak a scripting language into the browser by putting "Java" in the name, which was enough to fool the suits.

If my previous emotional text still has not impressed upon you the nature of this time, consider what this indicates. Look at Java. Look at JavaScript. Observe their differences. Observe how one strains to draw any similarities between them beyond the basics they share by both being programming languages. Yet simply slapping the word "Java" on the language was enough to get the suits to stop asking questions until much, much later. That's how crazy the Java push was at the time: you could slip an entirely different scripting language in under the cover of the incredible propaganda for Java. So while the press release says JavaScript was intended to glue Java applets together, because that's what the suits needed to hear at that point, that really wasn't the goal, and frankly it was never even all that great at it.
Turns out bridging the worlds of Java and JavaScript is actually pretty difficult; in 2025 we pay the requisite memory and CPU costs without so much as blinking, but in an era of 32 or 64 megabytes of RAM it was nowhere near as casual. You also had problems, like we still have today with WASM, in moving larger things back and forth between the environments, only much, much more so. The reality is that what JavaScript was intended to be by the people who actually created it and essentially snuck it in under the noses of the suits is exactly what it is today: the browser scripting language. We all wish it had had more than a week to cook before being shoved out the door itself, but it was still immensely more successful than Java ever could have been in that role.

(Finally, despite my repeated use of the term "suits", I'm not a radical anti-business hippie hacker type. I understand where my paycheck comes from. I'm not intrinsically against "business people". I use the term pejoratively even so. The dotcom era was full of bullshit, and they earned that pejorative fair and square.)
What's become clear is that we need to bring Section 230 into the modern era. We allow companies not to be treated as publishers for user-generated content as long as they meet certain obligations. We've unfortunately allowed tech companies to get away with selling us the idea that The Algorithm is an impartial black box. Everything an algorithm does is the result of a human intervening to change its behavior. As such, I believe we need to treat any company running a recommendation algorithm as a publisher (in the S230 sense). Think of it this way: if you get 1000 people to submit stories they wrote and you choose which of them to publish and distribute, how is that any different from publishing your own opinions?

We've seen signs of different actors influencing opinion through these sites. Russian bot farms are probably overplayed in their perceived influence, but they're definitely a thing. So are individual actors who see an opportunity to make money by posting about politics in another country, as was exposed when Twitter rolled out showing account locations, a feature I support. We've also seen Twitter accounts exposed as being ChatGPT when people told them to "ignore all previous instructions" and give a recipe. And we saw it with the TikTok ban that wasn't a ban: the real problem there was that TikTok wasn't suppressing content in line with US foreign policy, unlike every other platform.

This isn't new. It's been written about extensively, most notably in Manufacturing Consent [1]. Controlling mass media through access journalism (etc.) has just been supplemented by AI bots, incentivized bad actors, and algorithms that reflect government policy and interests.

[1]: https://en.wikipedia.org/wiki/Manufacturing_Consent