Who among us cannot be distilled into a skill?
Article | Sleepy.md
Unfortunately, in this day and age, the more wholeheartedly you work, the easier it is for you to be distilled into a skill that AI can replace.
These days, trending-topic lists and media channels have been flooded with "colleague.skill." As the story continues to ferment across major social platforms, public attention has been almost inevitably swallowed by grand anxieties: "AI layoffs," "capital exploitation," "the digital immortality of the working class."
While these are all genuinely anxiety-inducing, what unsettles me most is a single line in the project's README:
"The quality of raw material determines the quality of the skill: it is recommended to prioritize collecting long-form content written proactively by the person > decision-making responses > daily messages."
The people most easily and most perfectly distilled by the system, reconstructed pixel by pixel, are precisely those who work the hardest.
It's those who, after every project concludes, still sit at their desks to write a post-mortem document; those who, when faced with disagreements, are willing to spend half an hour typing a long-form response in a chat box, candidly analyzing their decision-making logic; those who are extremely responsible, meticulously entrusting all work details to the system.
Diligence, once the most admired virtue in the workplace, has now become a catalyst for accelerating workers' transformation into AI fuel.
The Drained Worker
We need to redefine a word: context.
In everyday usage, context is simply the background of a conversation. But in the world of AI, especially the world of fast-growing AI agents, context is the fuel in the roaring engine, the blood in the veins, the only anchor that lets a model make precise judgments amid chaos.
An AI stripped of context, no matter how impressive its parameter count, is nothing more than an amnesiac search engine. It cannot recognize who you are, cannot grasp the undercurrents hidden beneath the business logic, and has no way of knowing the long tug-of-war and trade-offs you went through, on a web woven from resource constraints and interpersonal dynamics, before finalizing a decision.
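The difference is easy to make concrete. In a minimal sketch of how an agent's prompt might be assembled (all names here, `AgentContext`, `build_prompt`, are illustrative inventions, not any real framework's API), the same question produces two very different inputs to the model:

```python
# Illustrative sketch only: how context changes what a model actually sees.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class AgentContext:
    """The accumulated background an agent needs to judge a request."""
    who_is_asking: str = ""
    business_constraints: list[str] = field(default_factory=list)
    decision_history: list[str] = field(default_factory=list)

def build_prompt(query: str, ctx: AgentContext | None = None) -> str:
    """Assemble the text sent to the model.

    Without ctx, the model sees only the bare query -- the 'amnesiac
    search engine' case. With ctx, constraints and past trade-offs are
    prepended so the model can actually judge the situation.
    """
    if ctx is None:
        return query  # amnesia: no idea who you are or what came before
    parts = [f"Requester: {ctx.who_is_asking}"]
    parts += [f"Constraint: {c}" for c in ctx.business_constraints]
    parts += [f"Prior decision: {d}" for d in ctx.decision_history]
    parts.append(f"Question: {query}")
    return "\n".join(parts)

bare = build_prompt("Should we delay the launch?")
rich = build_prompt(
    "Should we delay the launch?",
    AgentContext(
        who_is_asking="PM, infra team",
        business_constraints=["Q3 budget frozen"],
        decision_history=["Chose vendor B over A for latency"],
    ),
)
```

The bare prompt is answerable only generically; the rich one encodes exactly the hoarded history the article is about.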
And the reason "colleague.skill" has caused such a stir is precisely that it coldly and precisely locked onto the place where mountains of high-quality context are hoarded: modern enterprise collaboration software.
Over the past five years, the Chinese workplace has undergone a quiet yet grueling digital transformation. Tools like Feishu, DingTalk, Notion, and others have become vast repositories of corporate knowledge.
Take Feishu as an example. ByteDance has publicly stated that the number of documents generated internally every day is massive. These densely packed characters faithfully encapsulate every brainstorm, every heated meeting, and every strategic compromise of over 100,000 employees.
This level of digital penetration far exceeds any previous era. Knowledge used to be warm: it lurked in the minds of veteran employees and drifted through casual chats in the pantry. Now, all that human wisdom and experience has been forcibly dehydrated and ruthlessly deposited in cold server matrices in the cloud.
In this system, if you don't write documents, your work cannot be seen and new colleagues cannot collaborate with you. The efficient operation of the modern enterprise is built on every employee contributing context to the system, day after day.
Out of diligence and goodwill, workers lay their thinking bare on these cold platforms without reservation. They do it so the team's gears mesh smoothly, to prove their value to the system, to desperately carve out a place for themselves inside this intricate commercial behemoth. They are not voluntarily surrendering themselves; they are simply, awkwardly and diligently, following the survival rules of the modern workplace.
Yet, ironically, this contextual information left for interpersonal collaboration has become the perfect fuel for AI.
Feishu's admin panel has a feature that allows super administrators to bulk-export members' documents and communication records. This means the project reviews and decision-making logic you produced over three years of late nights can be packaged, with a single API call, into a lifeless compressed file in a matter of minutes.
When Humans Are Dimensionally Reduced to APIs
With the rise of "colleague.skill," some extremely uncomfortable derivatives have started to appear on GitHub's Issues section and various social media platforms.
Some have built "ex.skill," feeding an AI years of WeChat chat logs so it can keep arguing, or being tender, in that familiar tone; others have built "unrequited-love.skill," reducing an unreachable crush to a cold interpersonal sandbox in which probing dialogues are replayed over and over in search of the optimal emotional outcome; still others have built "paternalistic-boss.skill," pre-chewing oppressive, manipulative rhetoric in digital space to build themselves a sorry psychological line of defense.

The use cases of these skills have moved far beyond work efficiency. Without noticing, we have grown used to applying the cold logic of tooling to dissect and objectify people who were once flesh-and-blood, living individuals.
German philosopher Martin Buber once proposed that the foundation of human relationships boils down to two radically different modes: the “I-Thou” and the “I-It.”
In the “I-Thou” encounter, we set aside our prejudices and gaze upon the other as a complete, dignified living being. This bond is open without reserve, full of vibrant unpredictability, and precisely because of its sincerity it is especially fragile. Once we slip into the shadow of the “I-It,” however, the living person is reduced to an object to be dismantled, analyzed, categorized, and labeled. Under this nakedly utilitarian scrutiny, the only question we care about is: “What use is this thing to me?”
The emergence of products like “ex.skill” means the tool-rationality of the “I-It” has thoroughly invaded our most intimate emotional domains.
In a genuine relationship, a person is three-dimensional, full of wrinkles, constantly flowing with contradictions and nuances, and their reactions vary based on specific circumstances and emotional interactions. Your ex may react very differently to the same sentence when waking up in the morning compared to working late at night.
However, when you distill a person into a skill, what you keep is only the residue of their functionality, the part that happened to be “useful” to you, that could “benefit you,” within that specific bond. The once warm, self-experiencing individual is drained of their soul in this cruel refinement, alienated into a “functional interface” you can plug in and unplug at will.
It must be acknowledged that AI did not invent this chilling coldness out of thin air. Before AI emerged, we were already accustomed to labeling others, precisely measuring the “emotional value” and “social network weight” of each relationship. For example, in the dating market, we quantify a person’s attributes into grids; in the workplace, we classify colleagues as “capable” or “slackers.” AI just made this implicit, functional extraction between individuals blatantly explicit.
People have been flattened, leaving only that facet of “what is useful to me.”
Electronic Encapsulation
In 1958, Hungarian-British philosopher Michael Polanyi published “Personal Knowledge.” In this book, he introduced a highly penetrating concept: tacit knowledge.
In a famous dictum, Polanyi stated, "We know more than we can tell."
He gave the example of learning to ride a bicycle. A skilled cyclist rides effortlessly, keeping perfect balance through every tilt and shift of weight, yet he cannot precisely convey to a novice, in words or in dry physics formulas, the subtle intuition of that moment. He knows how to ride, but he cannot articulate it. Knowledge of this kind, which can be neither encoded nor spoken, is tacit knowledge.
The workplace is full of such tacit knowledge. A senior engineer, when troubleshooting a system failure, may quickly pinpoint the issue by glancing at the logs, but he would find it challenging to document this "intuition" built upon thousands of trial-and-error instances. An excellent salesperson may suddenly fall silent at the negotiation table, and the sense of pressure and timing that silence brings is something no sales manual can capture. An experienced HR professional may, just by observing a candidate's half-second of avoiding eye contact, sense the exaggerations on the resume.
What "colleague.skill" can extract is only what has already been written down or spoken: explicit knowledge. It can scrape your post-mortem documents but not the struggle behind writing them; it can replicate your decision replies but not the intuition behind the decisions.
What the system distills is always just a person's shadow.
If the story were to end here, it would be nothing more than another poor imitation of humanity by technology.
However, when a person is distilled into a skill, this skill does not remain static. It is used to reply to emails, write new documents, make new decisions. In other words, these AI-generated shadows begin to generate new contexts.
And these AI-generated contexts are then deposited in Feishu and DingTalk, becoming the training materials for the next round of distillation.
As early as 2023, a research team from the University of Oxford and the University of Cambridge jointly published a paper on "model collapse." The research indicated that when an AI model is iteratively trained using data generated by other AIs, the distribution of the data becomes increasingly narrow. Those rare, marginal but highly authentic human traits are rapidly erased. After just a few generations of training on synthetic data, the model completely forgets the long-tail, complex real human data and instead outputs extremely mediocre and homogenized content.
In 2024, Nature also published a research paper stating that training future generations of machine learning models on AI-generated datasets would severely taint their outputs.
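The narrowing these papers describe can be reproduced in miniature: fit a simple model to data, sample synthetic data from the fit, refit on the synthetic data, and repeat. The toy sketch below (my own illustration, not the papers' actual experiment) fits only a mean and standard deviation each generation, and the spread of the data collapses:

```python
import numpy as np

rng = np.random.default_rng(0)

def one_generation(data: np.ndarray, n: int, rng) -> np.ndarray:
    """'Train' a model (fit mean/std) on data, then emit synthetic data."""
    mu, sigma = data.mean(), data.std()
    return rng.normal(mu, sigma, size=n)

# Generation 0: genuine human data -- wide, with rare long-tail traits.
n = 50
data = rng.normal(0.0, 1.0, size=n)
spread = [float(data.std())]

# Each generation trains only on the previous generation's output.
for _ in range(1000):
    data = one_generation(data, n, rng)
    spread.append(float(data.std()))

# The long tail vanishes: spread shrinks toward zero, output homogenizes.
print(f"gen 0 std = {spread[0]:.3f}, gen 1000 std = {spread[-1]:.3f}")
```

Nothing here is lost through malice; each generation's fit is locally reasonable, yet the finite-sample estimate keeps shaving off the tails until only a mediocre center remains.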

This is like the meme images that circulate online: what began as a high-resolution screenshot is shared, compressed, and reshared by countless people. Each pass loses a few pixels and adds a little noise, until the image is blurry and smeared over, like an artifact caked in patina.
When the real human context carrying tacit knowledge has been squeezed dry, and the system can only train itself on these smeared shadows, what will be left in the end?
Who Is Erasing Our Tracks
What's left is only correct-sounding nonsense.
When the river of knowledge dries up into an endless regurgitation and self-consumption of AI by AI, everything the system exhales will become extremely standard, extremely safe, but also irredeemably hollow. You will see countless perfectly structured reports, numerous flawlessly crafted emails, yet they will lack any human touch, devoid of any truly valuable insight.
The great defeat of knowledge is not because the human brain has become dull; the real tragedy is that we have outsourced the right to think and the responsibility to leave context to our own shadows.
Days after the explosion of "colleague.skill," a project called "anti-distill" quietly emerged on GitHub.
The project's author did not attempt to attack large models, nor write any grand manifesto. They simply provided a small tool that helps workers auto-generate long texts on Feishu or DingTalk that look reasonable but are in fact stuffed with logical noise.
The purpose was simple: to hide one's core knowledge before the system distills it. Since the system loves to fetch "long-form content written proactively," feed it a pile of nutrition-free gibberish.
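The approach, as described, might look something like this minimal sketch (purely illustrative; not the actual anti-distill code): stitch jargon-heavy workplace boilerplate into long documents that parse as "proactively written long-form content" but carry no real signal.

```python
import random

# Illustrative phrase banks -- grammatical, jargon-heavy, information-free.
OPENERS = [
    "From a holistic perspective,",
    "Aligning with our north-star metric,",
    "Considering the end-to-end value chain,",
]
CLAIMS = [
    "we should close the loop on cross-team synergies",
    "the key is to empower granular touchpoints",
    "iterating on the underlying methodology remains critical",
]
CLOSERS = [
    "to drive sustainable long-term impact.",
    "so the flywheel keeps spinning.",
    "in order to break through the existing ceiling.",
]

def noise_paragraph(rng: random.Random) -> str:
    """One fluent, meaningless sentence assembled from the phrase banks."""
    return " ".join([rng.choice(OPENERS), rng.choice(CLAIMS), rng.choice(CLOSERS)])

def noise_document(paragraphs: int = 30, seed: int = 42) -> str:
    """Emit a long 'proactively written' text made of pure logical noise."""
    rng = random.Random(seed)
    return "\n\n".join(noise_paragraph(rng) for _ in range(paragraphs))

doc = noise_document()
```

To a scraper ranking "long-form content written proactively by the person," such a document scores high while containing nothing worth distilling.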
This project did not catch fire the way "colleague.skill" did; it even seems a bit insignificant and feeble. Fighting magic with magic still plays by the rules set by capital and technology. It cannot change the trend of the system relying ever more on AI and overlooking real humans ever more.
But this does not prevent this project from being the most tragically poetic and profoundly metaphorical scene in the entire absurd drama.
We work extremely hard to leave traces in the system: writing detailed documents, making careful decisions, trying to prove, inside this vast modern corporate machine, that we existed and that we mattered. We do not realize that these earnest traces will eventually become the eraser that wipes us away.
But looking at it from a different perspective, this may not necessarily be a complete deadlock.
Because what the eraser wipes away is always just the "past you." A skill packaged into a file, no matter how sophisticated its scraping logic, is essentially just a static snapshot. It is frozen in that exported moment, relying only on stale nutrients, endlessly spinning in established processes and logics. It lacks the instinct to face unknown chaos and certainly does not possess the ability to self-evolve through real-world setbacks.
When we hand over those highly standardized, formulaic experiences, we also free up our own hands. As long as we continue to reach outward and constantly break and reconstruct our cognitive boundaries, that shadow resting in the cloud will forever only follow in our footsteps.
A human is a fluid algorithm.
Mixin has launched USDT-margined perpetual contracts, bringing derivatives trading into the chat interface.
The privacy-focused crypto wallet Mixin today announced the launch of USDT-margined perpetual contracts (derivatives priced in USDT). Unlike traditional exchanges, Mixin has taken a new approach: "liberating" derivatives trading from isolated matching engines and embedding it in an instant-messaging environment.
Users can directly open positions within the app with leverage of up to 200x, while sharing positions, discussing strategies, and copy trading within private communities. Trading, social interaction, and asset management are integrated into the same interface.
Based on its non-custodial architecture, Mixin has eliminated friction from the traditional onboarding process, allowing users to participate in perpetual contract trading without identity verification.
The trading process has been streamlined into five steps:
· Choose the trading asset
· Select long or short
· Input position size and leverage
· Confirm order details
· Confirm and open the position
The interface provides real-time visualization of price, position, and profit and loss (PnL), allowing users to complete trades without switching between multiple modules.
Mixin has directly integrated social features into the derivative trading environment. Users can create private trading communities and interact around real-time positions:
· End-to-end encrypted private groups supporting up to 1024 members
· End-to-end encrypted voice communication
· One-click position sharing
· One-click trade copying
On the execution side, Mixin aggregates liquidity from multiple sources, accessing decentralized protocols and external markets through a unified trading interface.
By combining social interaction with trade execution, Mixin enables users to collaborate, share, and execute trading strategies instantly within the same environment.
Mixin has also introduced a referral incentive system based on trading behavior:
· Users can join with an invite code
· Up to 60% of trading fees as referral rewards
· Incentive mechanism designed for long-term, sustainable earnings
This model is intended to drive organic, user-led network growth.
Mixin's derivative transactions are built on top of its existing self-custody wallet infrastructure, with core features including:
· Separation of transaction account and asset storage
· User full control over assets
· Platform does not custody user funds
· Built-in privacy mechanisms to reduce data exposure
The system aims to strike a balance between transaction efficiency, asset security, and privacy protection.
Against the backdrop of perpetual contracts becoming a mainstream trading instrument, Mixin is exploring a different direction: lowering barriers while strengthening the social and privacy dimensions of trading.
The platform does not view a trade merely as an execution action but positions it as a networked activity: trades have social attributes, strategies can be shared, and relationships between people become part of the financial system.
Mixin's design is based on a user-initiated, user-controlled model. The platform neither custodies assets nor executes transactions on behalf of users.
This model aligns with a statement issued by the U.S. Securities and Exchange Commission (SEC) on April 13, 2026, titled "Staff Statement on Whether Partial User Interface Used in Preparing Cryptocurrency Securities Transactions May Require Broker-Dealer Registration."
The statement indicates that, under the premise where transactions are entirely initiated and controlled by users, non-custodial service providers that offer neutral interfaces may not need to register as broker-dealers or exchanges.
Mixin is a decentralized, self-custodial privacy wallet designed to provide secure and efficient digital asset management services.
Its core capabilities include:
· Aggregation: integrating multi-chain assets and routing between different transaction paths to simplify user operations
· High liquidity access: connecting to various liquidity sources, including decentralized protocols and external markets
· Decentralization: achieving full user control over assets without relying on custodial intermediaries
· Privacy protection: safeguarding assets and data through MPC, CryptoNote, and end-to-end encrypted communication
Mixin has been in operation for over 8 years, supporting over 40 blockchains and more than 10,000 assets, with a global user base exceeding 10 million and an on-chain self-custodied asset scale of over $1 billion.
