Nothing Foreign Has Decided What We Are
Why Soft Skills, Connection, and Action Are the Keys to Meaningful Outcomes
“Nothing foreign has decided what we feel, what we live, or what we are.”
This isn’t an anti-AI post. In fact, it’s partially inspired by a conversation I had with an AI chatbot. They’re wonderful for the conversation topics I wouldn’t want to trap another human in (especially at 11 o’clock at night), like “why are systems and individuals stuck in cycles of discovery without application” and “what is driving the devaluation of expertise and erosion of human connection”.
I recently posted about a company I freelance for not allowing AI in editing and how much I appreciated that they value human expertise. This sparked a few discussions in the DMs about what my value is. Or, in the case of one conversation, isn’t: “Your skills are obsolete because AI can do it all.”
It can be easy to resent AI when it appears to be devaluing the skills and expertise we’ve spent years, even decades, building. I find myself constantly adding to the way I consult and edit and the insight I provide so as to always be one step ahead of what AI has to offer, focusing on humanizing, as much as possible, a process that was already human. Because while my skills and knowledge are not, in fact, obsolete, they are no longer enough, or as valued as they were before AI models were trained on those foundations. Before knowledge that once had to be gained through intentional study and practice became available to anyone who asks for it with a quick prompt.
It can often feel like something foreign has, in fact, decided what we are.
When the job you’ve held for 12 years is replaced by a machine, it’s easy to feel powerless against a system you didn’t design (which, incidentally, is not as unique to the AI era as we sometimes think—it was a relatable enough concept in early 1990 to be used in the pilot episode of the British sitcom One Foot in the Grave, in which the main character was forced into retirement after his receptionist job was “replaced by a box”).
But the one thing we humans have that AI will never possess is agency.
AI can replace skills, and it’s only going to get better at doing so. Blaming AI for that is a choice to let AI take more than it has to offer.
Taking a deeper look into the mechanisms that have allowed AI to rapidly infiltrate most aspects of our lives reveals just how human that process has been… and how important humans will be in integrating new technology in a way that leads to actual outcomes. Not that this will be easy.
Productivity is, in many aspects of life, synonymous with volume. Humans have been replacing outcomes with output for decades. AI didn’t do this. Humans did. And then we developed AI to make it easier. Even in science, where discovery is supposed to be the point, we place value on how quickly we can get work done, how many publications we can get, how many grants… That publish-or-perish attitude is output focused, not outcome focused. Do more. Discover more. Share more. And while AI often appears to be simplifying productivity (think generating endless content, automating research, creating tools), it’s also driving the expansion of very human issues that only human intervention can solve…
The Discovery Trap: Knowing Isn’t Doing
This shows up everywhere: humans are extremely inclined to gather knowledge and then do nothing with what they’ve learned.
You might see this in a friend who had a tree fall down in their backyard and, instead of having it removed, decided to turn it into a picnic table… but a year later the tree is still there and your friend is still watching YouTube videos on woodworking. Or maybe it’s the family member who’s read a whole bookcase’s worth of self-help books but hasn’t changed any of the behaviors that are keeping them feeling stuck. Or it could be you, reading a fitness book, purchasing new activewear and a gym membership, then never setting foot on a treadmill.
We love to discover. We suck at applying.
This is partially rooted in the way our education systems are set up. Education is very focused on thinking, acquiring knowledge, and demonstrating that we have learned enough at each stage to progress to the next. There’s very little application of what we learn as we go through the system, and depending on what you study, you might never actively apply what you’ve learned even after finishing your degrees. Even if you think you’re applying it.
This isn’t limited to individuals. It’s a systemic issue as well. Think about biomarker studies, for example, that have a tendency to lead to endless promise but leave human needs unmet. How many labs are built around biomarker discovery but not around translating those biomarkers into meaningful diagnostic or therapeutic tools? There are a lot more biomarker studies than there are applications using biomarkers. So many promising discoveries never make it into reality.
The why behind this, from my observations (not particularly scientific ones, I must add), is extremely illogical, but that doesn’t make it any less human. Quite the opposite, in fact. Humans aren’t logical beings. We want to be. We want everything to make sense. We want to know why. But on a very fundamental level, we’re not. Just try logically defining exactly why you like something. Usually, we just like it and come up with reasons after we’ve noticed we like it. Which is fine. But it’s not logical, because we’re not, and this is how we get stuck in the discovery trap: learning information about how to apply something feels the same as actually applying it. We trick ourselves into thinking that understanding the problem is the same as solving it.
If we read a self-help book and learn why we can’t communicate with our mother, we tend to think we can now communicate with her better without actively changing anything about our communication style, because we know now… We’ll do things that actively make us sick because if we understand why they’re making us feel that way, we can somehow justify continuing to do them. If you ate something that made you feel the way drinking a bottle of wine feels, you might be cautious about eating that thing again, but the same outcome doesn’t stop you from drinking the wine.
The fact is that knowledge without action is just potential. It does not create change.
While AI can massively increase knowledge acquisition and production, it does not have the capacity to take action. We do not have the ability to create anything meaningful with AI unless we choose to apply what we learn from it. Our desire to gather and produce knowledge led us to create AI. Our capacity for agency gives us the ability to take action. And this is where you can continue to stand out in the AI era.
The culture we’ve created rewards output over outcomes, and AI, despite its potential, accelerates this trend. Breaking out of the discovery trap necessitates a focal shift toward outcomes, and obtaining meaningful outcomes means prioritizing human connection in a world that’s increasingly automated. Human expertise will never be obsolete. AI can drive innovation, but it’s up to us to ensure that this innovation serves the greater good—to prioritize outcomes that benefit people over outputs that benefit industries.
Output vs. Outcome: Expertise That Can’t Be Devalued
The ability to gain knowledge is not as valuable as it once was because knowledge has become so accessible. This shift can feel unsettling, and naturally, there’s some resistance to it, especially among individuals who have dedicated years of their lives to becoming experts in things (I have some of this resistance myself), but resisting reality is futile, and the increased accessibility of knowledge isn’t a bad thing. It’s also an opportunity to expand what expertise is. What AI can’t do is think, strategize, and build on existing knowledge with new, actionable information. It’s built on what is already known. So where your job might previously have been to know, the future will be in application. It’s time to get out of the discovery trap.
The areas where you can stand out amidst all the AI are the ones that tech development isn’t necessarily focused toward. (Note that I’m not saying any of this will be easy). If you look at the overall picture of where AI and technological development fits within how companies operate, how funding for research is prioritized (or not), and how progress is measured, there’s a clear disconnect between the goals of development and the needs of the population.
Tech development companies are focused on profit and, while there needs to be sufficient demand in the market for that to happen, the needs being met are not necessarily the ones that should be prioritized (there are many things we need more than an AI tool that can write for us). Government funding is typically allocated according to what positions a country best on the global stage, whether that’s defense, biosecurity, pandemic preparedness, or something less obviously useful, like “looking better than China”. AI, associated technologies, and personalized medicine are absolutely “trending” in the medical field, but this is largely driven by global issues and the rapid emergence of new technologies, perhaps less so by what humans need.
The massive increase in AI and technology is driven in part by the desire to make more money by building autonomous systems that reduce human involvement. Human-in-the-loop strategies are hugely cost-saving and often quite effective (though, in my opinion, not ready for healthcare). But developing new models that show progress in tech and innovation often benefits the tech and innovation industry a lot more than it benefits the population overall, because the primary goal is increased output (revenue and convenience). In research, projects often end up focusing on quite a narrow aspect of, for example, personalized medicine and general medical development. Research that cannot be integrated with AI or doesn’t fit the personalized medicine framework might be deprioritized, but the needs of the people that type of research benefits aren’t going anywhere. Neither is the need to translate research into clinical practice. If all the tech is used at the research and development stage and isn’t needed once the project is ready for translational studies, too much focus on tech applications in funding could actually limit the ability to translate research findings. That’s not a beneficial outcome…
What is a beneficial outcome is identifying where gaps exist in our ability to make meaningful change as a result of how we use technology. Where there’s temptation to pivot your research toward areas that funders are currently interested in, or to develop policies that integrate the most recent technology to replace or otherwise reduce the workforce, you have a choice to take a step back and evaluate whether the output you would produce actually represents progress. Obtaining more of something is not the same as improving it. When integrating or developing tech in your processes is driven by things other than benefitting patients, communities, etc., it becomes meaningful only at a surface level. Taking it deeper and converting productivity into tangible benefit takes strategy and expertise that AI doesn’t have and likely will not have any time soon.
Your skills in critical analysis, strategy, and communicating in a way that gets people on board with taking action instead of getting stuck in a loop of increasing outputs just because they can… those skills are more valuable now than they have ever been. Combined with the increased access to knowledge that AI provides, they create an advantage for anyone who can translate this increased production and knowledge-gathering power into beneficial change.
Breaking out of the discovery trap and shifting from output to outcome requires connection. Without understanding the true needs of the people we aim to serve, even the best strategies will fall short.
Human Connection: The Disappearing Link (that we can get back)
It’s not possible to create anything that benefits humanity without knowing where the true needs are. This is one aspect of knowledge that AI and technology have made less accessible. The decline in human connection as more and more things have moved online, especially post-COVID, has substantially reduced our desire and ability to connect on a wide (and often unintentional) scale. Remote everything is sold to us as freedom, but it’s dictated by the systems that have been developed, not by “what we feel, what we live, or what we are.”
We can’t know what matters without connection, and we can’t know how to apply knowledge in a beneficial way if we don’t know what it needs to be applied to to achieve these outcomes. Community can’t exist without intentional communication and imperfect, real-world human interaction. This is where understanding, which AI can’t give us, comes in. Connection with people comes from listening and engaging and building trust. When we build community, we naturally see the gaps—the questions that aren’t being asked, the needs that are missed in study designs that don’t consider them.
Intentional human connection is a critical function of any system that is going to result in meaningful outcomes. Abandoning that by going through every aspect of life behind a screen will keep us stuck in the discovery trap and producing things that might seem impressive but don’t create anything lasting or meaningful.
The people who still know how to connect, listen, empathize, and respond (instead of react) are the ones best positioned to take the output-focused benefits of AI and translate them into action that actually benefits others. Soft skills like strategic empathy, communication, resilience, and adaptability bridge the gap between knowledge and action. They’re the foundation of human connection and the key to creating meaningful outcomes. If you have them, you can also teach them to others… It’s not a case of dividing people into “those who can” and “those who can’t”: these abilities are called soft skills for a reason. All skills can be learned (check out @graceforpersonalityhires on TikTok if you want to start focusing more on developing your soft skills).

Building soft skills builds connection, which reveals unmet needs. Technology provides potential solutions. You have the capability to take action. Bringing all of these things together is how you stand out and make an actual difference.
And I think you will.
Because “nothing foreign has decided what we feel, what we live, or what we are.”