Decentering Men

In 2025, the phrase “decentering men” has gone from niche feminist theory to mainstream mantra, slipping into everyday conversations, TikTok stitches, and even celebrity interviews. Unbeknownst to me, given my demographic, I was not even familiar with the word or the concept it tries to clarify. I ran into it while reading something unrelated a couple of days ago.

Coined by writer Sherese (Charlie) Taylor in 2019, the concept describes a deliberate refusal to let men, or the cultural expectation of male approval, remain at the center of a woman’s thoughts, decisions, self-worth, or life plan. From what I was able to glean, this is not a catchy slogan dreamed up for viral soundbites (as it seems to be at first blush); instead it is a political response to the quiet rage many women feel after years of shrinking themselves to fit the shape patriarchy carved out for them. Taylor calls it the exhaustion of living at eighty-five percent, always waiting for permission to take up full space.

The exhaustion has reached a breaking point. The reversal of reproductive rights, the mainstreaming of incel rhetoric, and the resurgence of tradwife aesthetics dressed up as empowerment have had a snowball effect of sorts. In this context, women are watching their safety, autonomy, and labor be treated as negotiable. At the same time, many now have the economic and social freedom previous generations could only dream of. Singlehood is no longer an automatic financial death sentence, and that shift has forced a reckoning: if we can finally survive and even thrive without male partnership, why are we still taught that our power comes from proximity to men? If financial independence is the north star for a woman, then a man does not need to play any role in reaching it. This has long been true for many women, though arguably more women are now able to achieve such independence than ever before.

Decentering men, then, is the practice of unlearning the lie that men are the center of the universe and critically required in a woman's life. It means interrogating every inherited belief that a boyfriend, husband, or ring is the ultimate prize, the final destination, the proof that you are worthy. It means noticing how often women orient their schedules, style, ambitions, and emotions around being chosen, and choosing instead to orient around themselves. The phrase resonates because it names something millions of women have felt but never had language for: the relief of waking up and realizing the day does not have to revolve around what some man might think.

A common misunderstanding is that decentering men requires swearing off dating, sex, or love with men entirely. It does not. You can still desire partnership, still fall in love, still post a blurry hand on a steering wheel if that’s your vibe. The difference is that the relationship no longer gets to colonize the rest of your life. You stop accepting mediocre treatment because “at least he picked me,” stop staying in draining dynamics to avoid the stigma of being single, stop treating exes as evidence that good love doesn’t exist. The bar simply rises: a man is welcome in your world only if he adds to a life that is already full, never if he requires you to make yourself smaller.

This is why the trend feels so threatening to some and so liberating to others. When women stop treating male attention as the sun around which their planets must orbit, entire systems built on female self-sacrifice begin to wobble. Dating becomes less about auditioning for validation and more about mutual enrichment. Friendships, creativity, rest, and ambition reclaim the center stage they were always meant to have.

I hope the trend is not inclusive of the fathers, brothers, male friends, and mentors a woman has in her life. That would be a bit of bathwaterism in many cases.

Interviewing Robots

I have interviewed at least three candidates in the last six months who had AI assisting them in real time during the interview. It left me feeling like I was talking to a robot, not a human. It does not surprise me that artificial intelligence is fundamentally reshaping the job search process, prompting many companies to revert to an old-school hiring solution: the in-person interview. As virtual interviews became ubiquitous, especially for remote and technical roles, employers noticed a rise in candidates using AI to cheat, such as feeding off-screen answers into coding tests or, in rare cases, even using deepfakes to impersonate job seekers online. High-profile firms like Cisco, McKinsey, and Google have now started requiring face-to-face meetings for some positions, aiming to ensure skills are real and credentials are genuine.

The current climate is described as an “AI arms race,” with both sides, job seekers and employers, leveraging technology in escalating ways. Companies use advanced software to filter applicants and automate the hiring funnel, while frustrated candidates employ AI tools to churn out optimized applications and ace digital interviews, sometimes crossing into outright fraud. The increasing sophistication of AI-generated fakery, combined with fresh warnings from the FBI about international scams, has made identity and skill verification more urgent than ever.

To combat these risks, employers are not only returning to in-person interviews but also investing in biometric identification, collaboration with platforms like Clear, and vigilant screening for signs of digital manipulation. Recruiters report that even the mere mention of an in-person interview can deter would-be scammers, and that up-close interactions often reveal qualities and potential red flags that video screens frequently conceal. The widespread adoption of these safeguards marks a shift in the balance between digital convenience and the human factors required for genuine trust.

This evolution reflects a new reality: as AI blurs boundaries between real and fake, face-to-face meetings reclaim their old role as the gold standard for authenticating candidates and cultural fit. The hiring process is coming “full circle,” exposing both the risks of over-reliance on algorithms and the renewed value of direct human connection in an increasingly mediated world. For candidates and recruiters alike, the message is clear, authenticity and good judgment matter more than ever in the age of AI-driven job applications.

Winning Big

J is the demographic this Morgan Stanley study talks about, and as a mother, I'd love for her to be in the other 55% of the population, but only time will tell if that will be true. The study notes that women’s rising influence in the global economy is reshaping both corporate practice and consumer markets. Since 2010, the percentage of women executives has grown steadily across developed regions, and in Asia, participation has doubled. Public conversations around gender diversity, wage gaps, and women’s evolving roles in the workplace have created momentum not just for greater equality, but also for more nuanced business strategy and governance. Morgan Stanley’s research demonstrates that companies embracing gender diversity tend to see stronger, more resilient performance, fueling both culturally progressive and bottom-line gains. Those things seem net-positive for women, at least at first glance.

But there is a catch. A key driver of this shift is the increasing number of single, working-age women in the U.S. and other developed markets. By 2030, 45% of American women aged 25-44 are expected to be single, the largest share ever, up from 41% in 2018. This trend is propelled by women delaying marriage and childbirth or opting out entirely, factors which support higher labor force participation and a narrowing gender pay gap. While caregiving responsibilities still impact earnings, a growing share of women are primary breadwinners in their households and are earning bachelor’s degrees at rates surpassing men. 

This profound demographic change is reflected in spending power: women contribute an estimated $7 trillion to U.S. GDP, and are pivotal consumers in most households. Single women, in particular, outspend the average household on apparel, personal care, eating out, and even luxury goods. Their increased earning power is set to benefit sectors that cater to their preferences, and their rising financial independence is a strong tailwind to broader economic growth.

On a global scale, the correlation between gender-diverse organizations and stock outperformance is compelling. Companies with the highest levels of gender diversity (measured by Morgan Stanley’s Holistic Equal Representation Score, or HERS) have outperformed their less diverse peers across every major developed region, even after adjusting for factors like size and risk. While the relationship between diversity and returns is complex, the evidence suggests that gender inclusion is both a reflection of forward-thinking corporate culture and a catalyst for superior outcomes. As women’s roles in leadership and consumer markets continue to expand, their impact on economic, cultural, and financial landscapes is likely to deepen in the years ahead.

As I read this, I was not sure how to feel about the big wins for women given what has made them possible.

Writing Nook

Readers are often curious about where the writers they love make their magic. So it was no surprise that I found this series of pictures of the places where writers who've won the Booker Prize write. There is something vulnerable and intimate about allowing the world to see these spaces. They range from warm and cozy to spartan to busy, and everything in between. I haven't read these writers, but it would be interesting to see if their styles mirror the spaces they work from. Seeing these pictures reminded me of a lovely essay I'd read some time back about a foundational book from childhood that shapes how a writer sees the world and what they end up caring about. The connection between the nooks and the essay is not obvious, but it made sense.

There seem to be dozens of extra chairs. It’s striking that the volume of the rooms was a problem for the very rich. The furniture available didn’t scale to the size of the rooms, so you just had to put more of it in them. My mother, I remember, was walking to my right, the first time I entered that red room, and thought about the problem of the furniture. I must have remembered the room so clearly because it’s the room where I had an idea I’ve taken with me—a thought I have re-thought, a thought that has remained mine. 

I think this is important: memories and ideas happen in a place. An essay is a place for ideas; it has to feel like a place. It has to give one the feeling of entering a room. 

The writing nook is a place for ideas and feels like one. Even when things might be out of order, or not how you imagine symmetry and balance would look, it works because it is where ideas are churned and made into things that you read, love, and remember. That is what draws the viewer into the photos.

Right Time

My first thought when reading this article was about women who don't end up finding the right partner soon enough and are past child-bearing age when they finally do. Everything finally comes together: they now have the means to take several years off to be a full-time mother, and a partner who is eager and willing to do their part. The stars have finally aligned. Maybe technologies like this can give them a shot at having it all.

In a landmark study published in Nature Communications, scientists at Oregon Health and Science University have successfully created human eggs from skin cells, which were then fertilized in the lab to form early-stage embryos. This breakthrough, while still preliminary, suggests that one day, eggs could be generated for people facing infertility, such as older women, those affected by cancer, or even same-sex couples wishing to have genetically related children. The technique borrows from the same scientific principles that enabled the cloning of Dolly the sheep, involving the transfer of a skin cell nucleus into a donor egg stripped of its own nucleus.

Despite the excitement, the process remains far from ready for clinical use. Most resulting embryos showed chromosomal abnormalities and were not able to develop past early stages, with only a small percentage reaching the blastocyst phase typically needed for IVF. Scientists are still working to address the challenge of reducing chromosomes properly during egg creation, a step vital for healthy embryo development. The team has dubbed their process "mitomeiosis" and is continuing to refine techniques for chromosome pairing and segregation.

The wider implications of this work are enormous. If perfected, in vitro gametogenesis (IVG) could revolutionize fertility treatment, making it possible for individuals to have children at any age or after losing fertility for medical reasons. It also raises profound ethical questions: from the possibility of “embryo farming,” to screening for genetic traits, to the use of gene editing for disease prevention or trait selection. Experts note that this technology could be more than a decade away from safe and legal human application, with significant regulatory and societal debates ahead.

For now, this research offers new tools for understanding human reproduction and for studying errors in meiosis, the process by which eggs naturally halve their chromosomes, a process critical for fertility that becomes error-prone with age. While many hurdles remain, this innovation represents a bold step towards the future of reproductive science, with vast potential and equally important questions about how and when it should be used.

Stasis Coming

I found this article on the value of discomfort to brands useful in understanding my own relationship with comfort, and that of some other folks who naturally came to mind. The author argues that, for leading brands, discomfort isn’t just a risk; it’s an economic necessity. The article uses vivid natural analogies and business examples to argue that comfort leads to entropy, mediocrity, and stagnation. Like the ponderosa pine relying on fire or wildflowers needing abrasion to spread, real growth in business is born from upheaval rather than safety. Too many companies chase incremental improvement, confusing harmony and consensus with long-term success, while true vitality often comes from moments that make leaders and teams “wince.”

Complacency is framed as the “comfort trap,” where routines and best practices lull businesses into a false sense of security. The author points out that market volatility and disruption punish those who coast. Instead of self-imposed stasis, businesses should deliberately invite discord, risk, and the kind of creative destruction economist Joseph Schumpeter championed, even if it means confronting sacred cows or launching bold ideas before they’re fully polished.

Humphris highlights the distinction between “capability” and “optionality.” In uncertain times, many organizations slash investments in R&D or experimentation, doubling down on what’s familiar and “safe.” This is depicted as a long-term mistake, as it erodes an organization’s ability to adapt or seize future opportunities. The concept of “optionality”, building antifragile systems and a tolerance for failure, emphasizes the payoff of embracing noise, ambiguity, and failed experiments in the pursuit of the few breakthroughs that can deliver outsized results.

The author urges brands to “choose the fire,” to walk straight into discomfort by backing daring ideas, divisive strategies, and innovations that aren’t guaranteed to succeed. In a marketplace awash with sameness and consensus, real value is found at the edge, where risk and discomfort sharpen thinking, foster originality, and set apart those willing to endure the “ordeal of growth.” If every decision feels comfortable and everyone agrees, the warning is clear: you’re already falling behind.

I have gone out of my comfort zone and done things that did not at all feel safe. The outcomes prove that was the right choice. But there was an age and stage of life for that. I don't feel nearly as eager to jump into the fire anymore. I see that mirrored in many others I know, folks who once took incredible risks and simply won't anymore. Maybe we've let stasis take over, to our detriment.

 

Being Visible

Publishers have been battling a variety of problems for a while now, and that has implications for those who create content and for advertisers who hope to catch the attention of those who consume it. It was interesting reading this article even though it focuses on UK publications, because the trends are broader. Online traffic data for major UK city news websites in August 2025 shows a landscape of sharply contrasting fortunes, according to the latest IPSOS figures. While flagship brands like the Manchester Evening News (MEN) saw a 16.9% surge in monthly unique users compared to the previous year, reaching over 9.4 million, other leading city news sites experienced either rapid growth or notable declines. Audience composition is shifting as digital readers become pickier and local engagement ebbs and flows.

Some sites saw remarkable gains: Liverpool Echo grew by 19%, Bristol Post by 32%, and Leeds Live more than doubled its audience year-on-year, up 151%. Smaller regional titles like News & Star in Carlisle led in engagement, clocking over 8 minutes per user, far above the big cities’ averages. Meanwhile, many once-dominant sites, such as Yorkshire Evening Post and Gloucestershire Live, saw double-digit drops in unique users, revealing growing volatility and a competitive struggle for digital attention.

Engagement rather than raw traffic is emerging as a key metric. While high-traffic brands still rule the charts, strong user engagement is now a differentiator for smaller outlets whose average visitor spends far longer on site than at the national giants. The competition for local digital loyalty is intense, with some regional titles making big strategic gains even as rivals falter.

Behind the numbers, a more subtle trend is at play: “untagged” sites, which don’t use IPSOS’s analytics code, are growing more volatile, with traffic swings that defy easy explanation. This new normal means that winning in the big-city news game requires more than just chasing raw clicks. It calls for granular focus on loyal readers, innovation in local coverage, and ongoing adaptation as the digital news environment becomes fiercer and less predictable with every passing month.

The contrasting fortunes and shifting engagement patterns for big-city and local news websites are part of a global trend. According to multiple industry reports and analyses, news site traffic overall is on the decline in many countries, with sharp drops for traditional news websites and newspapers, both large and small. Engagement is becoming a crucial metric, as audiences increasingly prefer brief, real-time, or interactive formats, often served through social media, video platforms, or AI-driven news aggregators rather than directly visiting news websites.

Across six continents, younger audiences in particular now source most of their news via platforms like TikTok, YouTube, or Instagram, shifting traffic away from publisher-controlled sites. While some regional publications have succeeded in growing traffic through strategies like live updates or deepening local engagement, the overall environment is highly volatile, and many major brands have seen year-over-year declines in both visits and user loyalty. In some top 50 global news sites, as few as one-third reported annual growth, while others saw double-digit drops.

The rise of personalized news feeds like Google Discover has further shifted the landscape, with aggregators now playing a key role in shaping what audiences see, often at the expense of direct visits to news sites. This has upended established SEO strategies and made publishers more dependent on algorithms, which can redirect massive amounts of traffic based on shifting recommendations. In the U.S. and elsewhere, the decline of local news is especially pronounced; the loss of newsrooms and reduced staffing have created “news deserts” and increased audience fragmentation.

Ultimately, local and big-city news websites everywhere face the challenge of maintaining relevance and revenue as audience attention fragments and migrates. The ability to foster engagement, innovate in coverage and delivery, and pivot quickly to platform and algorithm changes is more important than ever in 2025’s global news ecosystem.



Em Dashing

This essay reflects on how a relatively recent shift in language has become a surprisingly lively battleground: the use of the em dash in AI-generated writing, notably by ChatGPT, has become a signal to readers that a passage might have been produced by a machine. While some online commentators argue that humans never use this punctuation, experienced writers and editors protest, pointing to literary greats from Dickens to Dickinson who filled their sentences with dashes, arguing that the em dash evokes the rhythms of natural speech and thought better than many alternatives. The debate reveals not just a changing attitude toward punctuation, but a deeper confusion about what writing even is, and who (or what) it’s for as AI increasingly participates in our written conversations.

This linguistic skirmish is emblematic of a broader transformation: in the digital age, vast swathes of what used to be spoken conversation have migrated onto screens. Where writing once meant carefully-crafted, edited, and published text, today it also encompasses emails, text messages, and endless streams of posts, DMs, and chat replies. The majority of a modern person’s “writing” is now effectively informal speech, quick, reactive, and more concerned with immediacy than syntax or literary craft. This everyday writing has its own gorgeous expressiveness, playful forms, and relaxed standards, but it stands in tension with bookish conventions.

Large language models like ChatGPT train on mountains of traditionally composed prose and old print matter, then mimic that style, right down to the nuanced use of em dashes, when asked to generate responses. The spectacle of digital assistants reflecting our literary past back at us, using “outdated” punctuation and orthography, sometimes seems uncanny or even robotic to readers more accustomed to the fast-and-loose typographical style of real-time digital life. We’re left in a strange position: the machines are conserving a literary tradition that digital natives are happy to leave behind.

This punctuation debate is ultimately about more than dashes, it’s evidence of an epochal shift in how we relate to language, writing, and technology. The rise of AI text is forcing us to reconsider what constitutes authentic human expression. Maybe the quirks we now read as a machine “tell” are actually vestiges of our own literary heritage, reminding us that as digital life accelerates, the artifacts of thoughtful, artful writing risk fading into history. As we adapt to this hybrid future, the real story may be how we negotiate the boundary between old and new, formal and informal, human and AI, one dash at a time.

Fading Relevance

It was nostalgic to read this essay about writers I loved growing up that no one talks about these days. It seems as if a quiet erasure is overtaking American culture: the works and artists of the mid-20th century, once considered foundational, are rapidly fading from public memory. Renowned authors like John Cheever, John Updike, Saul Bellow, and Ralph Ellison, whose books were once literary staples, are now strangers to younger generations. Their novels, short stories, and even Pulitzer Prize–winning operas and Broadway works from the era struggle to find audiences, and institutions that once celebrated such cultural milestones now rarely feature or even acknowledge them. Back in the day, when I could not get enough of John Updike, I would have found it impossible to conceive of a time when he would become completely irrelevant.

This disappearance extends beyond literature. Film and music from the 1940s and 1950s, including what many consider masterpieces, are increasingly absent from streaming libraries and concert halls. Netflix algorithms offer modern documentaries in place of “Citizen Kane” or “Casablanca,” and jazz radio stations rarely play early Duke Ellington or Charlie Parker. The effect is that even though America’s cultural influence was at its peak in that era, the nation’s most creative works are vanishing from the daily cultural conversation. To have never watched movies like Citizen Kane and Casablanca, each of which shaped and defined a genre, is a big miss.

The root of this erosion, the author argues, is not only generational forgetfulness but the nature of digital culture itself. The internet and its platforms, always steeped in the “now,” create an illusion of immediacy that erases the weight of history. Unlike libraries or museums, streaming services and social feeds privilege content that is current, clickable, and algorithmically favored, sidelining anything that doesn’t fit the digital present. Repetition and meme culture triumph, while rich traditions and earlier masterpieces risk being crowded out.

Yet not all is lost. There are signs of revival in smaller, passionate communities, like young jazz musicians rediscovering pre-war swing, or film buffs obsessing over classic cinema, and the very architecture of libraries and archives still embodies the responsibility we have to remember and cherish the past. The challenge is to resist the web’s tendency toward amnesia and recognize our responsibility: to preserve, pass on, and continue creating cultural legacies that will outlast fleeting trends and digital churn. 

Healthy Dissent

“Why Aren’t Professors Braver?” explores the pervasive culture of self-censorship, anxiety, and risk aversion among contemporary faculty in American higher education. The article highlights the paradox that, despite their roles as educators, intellectual leaders, and supposed guardians of academic freedom, many professors hesitate to speak openly or take principled stands on controversial topics. This hesitancy stems from a host of pressures: concerns about student evaluations, fear of professional backlash, and broader institutional cultures that often reward conformity and caution over boldness or dissent.

The author observes that universities, once idealized as bastions of debate and fearless inquiry, have become environments where faculty can feel scrutinized by administrators, students, and even the public for airing unpopular views or trying innovative teaching methods. The omnipresence of social media and online outrage heightens this caution, as even small missteps or offhand comments can go viral, risking reputational or career consequences. Professors increasingly calculate what is “safe” rather than what is intellectually rigorous or truthful, with many admitting frequent self-censorship in lectures, publications, or departmental meetings.

Underlying this is a structural issue: tenure, long thought to guarantee academic freedom, is less common and less protective than ever, while contingent and non-tenure-track faculty face even greater insecurity. Administrators and boards often prioritize institutional reputation and enrollment over faculty autonomy, creating incentives for professors to avoid controversy. The result is a risk-averse professional climate where the cost of bravely challenging prevailing orthodoxies can be social isolation, reputational harm, or even job loss.

Ultimately, the piece suggests that this culture of timidity and conformity is unhealthy not just for individual academics, but for the intellectual and civic life of universities as a whole. A more vibrant academic culture would require not only formal protections, but a recommitment to genuine curiosity, robust discussion, and the cultivation of moral and intellectual courage. Without it, higher education risks drifting further from its mission of fostering fearless inquiry, leaving both faculty and students deprived of the encounters with ideas and questions that make academic life truly meaningful.



Silently Strong

I watched Babel recently and it got me thinking about the character I found most moving among several strong contenders. That led me to this essay that helps the viewer understand their response to Chieko. Alejandro González Iñárritu’s film “Babel” is a striking meditation on the barriers that divide humanity, be it language, culture, or circumstance, and the universal yearning for genuine connection. Psychiatric Times highlights how Iñárritu, through his fragmented, nonlinear storytelling, constructs a kind of cinematic “Rorschach test” that invites us to grapple with the motivations, traumas, and struggles of his characters. In “Babel,” empathy, what psychiatrists call “the universal language of the human heart,” emerges as a critical bridge in a world too often fractured by misunderstanding and authority-driven cruelty.

Among the film’s deeply interwoven narratives, Chieko’s story stands out for its emotional complexity. A deaf Japanese teenager coping with profound grief and alienation, Chieko’s behavior, her rage, sexual impulsiveness, and risky vulnerability, confounds those around her. Through a series of aching, wordless encounters, especially a raw and pivotal moment with a compassionate detective, Iñárritu shows the unspeakable pain that lies beneath her actions. Scenes between Chieko and her father, as well as the enigmatic note she gives the detective, reinforce how much can be communicated by gesture, touch, or gaze even when language fails.

The film’s ambiguity invites speculation and self-reflection: is Chieko’s turmoil simply that of a young woman wrestling with sexuality, or is it rooted in unresolved trauma, depression, or even family dysfunction? One of “Babel’s” powers is to resist easy answers, instead prompting viewers to fill narrative gaps with their own emotional reading. This technique, echoing the viewer’s role in the therapeutic space, brings new depth to the portrayal of psychiatric realities on screen.

“Babel” also stands as a technical masterwork, with Rodrigo Prieto’s cinematography lending each storyline distinctive color and atmosphere, from neon-lit Tokyo to Morocco’s desolate mountains. The interplay of editing and film stock further blurs the lines between stories, intensifying the emotional tempo and artfully capturing the disjointedness of modern experience. This collage method, borrowed from Tarantino but stripped of violence in favor of weary realism, anchors the film’s exploration of how deeply social structures and accidents of fate can shape human lives.

Ultimately, “Babel” is not just a film about despair; it is also about the hope and redemption possible through authentic human connection. Like the best psychiatric encounters, Iñárritu’s work leaves space for healing and meaning, even amid suffering. It is an invitation for practitioners and viewers alike to recognize the invisible bonds that unite us, and to practice empathy as a universal language, reminding us that the heart of psychiatry, and perhaps of art itself, lies in our capacity to reach across divides and truly see each other. I saw Chieko as tragic, wounded, and yet wonderful. She displayed the courage to be expressive in ways that would be deeply uncomfortable to her and to whoever was invited into the spectacle. In that she is supremely powerful and cannot be ignored, even though she cannot speak.



College Scam

I read this story in Slate recently about a scam that a bunch of college-bound students stumbled into. In the mid-1990s, a group of ambitious American students, drawn by brochures, recruiting booths, and an alluring narrative, set out for the academic dream of Oxford, only to find themselves enrolled at the obscure Warnborough College on Boars Hill, far from the hallowed halls they’d envisioned. Lured by marketing that artfully blurred the line between Warnborough and the prestigious University of Oxford, these recent high school graduates believed they were heading toward a first-class British education and a launching pad for future success. Instead, they faced confusion and disappointment as it quickly became clear their new school wasn’t associated with Oxford at all, and that even the credits they earned there would be worthless.

Some tried to make the best of it, finding small silver linings in Warnborough’s tight-knit, international community, interesting classes, and British adventures. Yet, the reality of misleading advertising and the lack of accreditation soon hit home. Students who left within weeks faced lost scholarships and significant debt, while even those who stayed ultimately learned that their coursework would not count toward degrees at real universities. The ensuing scandal drew media scrutiny on both sides of the Atlantic and led to court cases, public humiliation, and a deep sense of betrayal among the students and their families.

Warnborough’s president, Brenden Tempest-Mogg, and the college administration consistently denied responsibility for any misrepresentation, attributing blame to misunderstandings or the actions of others. Yet legal rulings found that the promotional materials were misleading, and Warnborough was barred from participating in U.S. federal loan programs and ordered to pay restitution, money the students never received. For many, the incident derailed carefully laid plans for academic and professional advancement, burdened families financially, and left emotional scars.

Looking back, the Warnborough saga stands as a cautionary tale about the high stakes of college admissions and the importance of due diligence, especially for first-generation students and those from less privileged backgrounds, who may lack the resources or knowledge to vet international programs. The story also highlights the persistence of hope, the ways young people try to adapt in the face of challenge, and the long-term impacts that a single misstep in the pursuit of academic aspiration can have on a person’s life. This kind of scamming is alive and well in both the US and Canada to this day.


Old Ideas

I am naturally curious about Gen Z, being the mother of one and also running into them in the workplace. This Atlantic essay about their unique relationship challenges did not feel so unique after all. The vocabulary may have evolved along with the means of communication, but the concepts have been around forever.

For much of the late 20th century, American dating followed a familiar, linear script, one so entrenched that even pop songs like Meat Loaf’s “Paradise by the Dashboard Light” could map intimacy as a baseball diamond, complete with “first base” and “home run.” Today, however, Gen Z has largely rewritten the rules. The base system and its implied sequence of romantic milestones are now viewed by many young people as outdated or even ironic, replaced by a profusion of new terms and scenarios that reflect far more fluid, ambiguous, and individualized approaches to sex, connection, and commitment.

Young adults in Gen Z use a vast vocabulary, including “sneaky links,” “zombies,” “breadcrumbing,” and “situationships,” to describe a wider array of relationships, many of which blur or disregard traditional stages and labels. These shifting social dynamics have both liberated and complicated romance. It’s now just as common for a serious relationship to grow out of a long-distance online bond, or for intimacy to precede any emotional label. The pressure to hit milestones has faded, replaced by encouragement (and sometimes confusion) to move at one’s own pace, whether that means exploring kinks as a teen or remaining a virgin into adulthood.

This new “buffet” of dating and sexual options lets Zoomers chart highly personal paths, but the abundance of choice brings its own stress. Many say they struggle to find guidance, despite the internet offering endless information. Paradoxically, while Gen Z has access to more data about sex and relationships than any previous generation, they often lack the tools or role models to decide what truly feels right or how to communicate about vulnerability and preferences. In navigating this ever-evolving landscape, some fall into patterns of avoidance, performative coolness, or emotional detachment, preferring the safety of casual liaisons or ambiguous “talking stages” over labeled relationships.

Sexual openness can be both freeing and fraught. Gen Z is more likely to embrace “enthusiastic consent” and candidly discuss sexual pleasure, but porn and digital platforms have also shaped new expectations, sometimes pushing boundaries before emotional intimacy is established. At the same time, peer acceptance and the ever-watchful gaze of social media means there’s sometimes more fear around being emotionally vulnerable (like public hand-holding) than around physical intimacy itself. Many Zoomers agree: “Sex is easy, emotional connection is hard,” and there’s a collective ambivalence about exposure, risk, and potential for embarrassment.

Yet, for all the contradictions, one constant remains: the desire to connect and the anxiety that comes with it. Gen Z’s new landscape allows for greater self-directed exploration, but also reveals a generation sorting through loneliness, self-consciousness, and the pressure to seem unbothered. Some ultimately do reject the chill exterior, choosing to pursue connection and clarity even if it means braving the very feelings and risks their cultural script tells them to avoid. In the end, as much as the specifics change, the basic human longing for intimacy, acceptance, and understanding endures.

Some variant of all they are experiencing has existed forever. Maybe the speed of information exchange compresses the time to decision so significantly that new words are needed to describe what is happening. I would argue, for instance, that the concept of a situationship is not novel. A couple in an arranged marriage that is mostly loveless and only mildly friendly, where the parties have agreed on a don't-ask-don't-tell policy about extracurricular activities, has been around forever and is exactly the same thing. It only had the "benefit" of a socially recognized union that brings certain value to the couple.

Loving Workaholism

As someone who has always worked hard not to get consumed by work and has strongly advocated the same to those I love and care about, I am no fan of workaholism and have never been a victim. My attempts to rescue some who are have generally failed, but that does not stop me from trying. It always feels like the right thing to do. Reading this essay made me think of a couple of people I know who have been overdue for a reset. Workaholism is a modern epidemic that has quietly taken root in corporate culture, seducing high achievers with the illusion that working harder will ultimately buy them love, acceptance, or a sense of worth. Endless hours spent at the office, a constant mental preoccupation with job tasks, and the inability to disengage are common symptoms for many professionals and yet, for all this effort, life often feels hollow and relationships suffer. The tragedy is not just the exhaustion, but the false belief that affection or happiness can be earned by relentless toil, when in reality the returns diminish and emotional needs remain unmet.

At the core of workaholism is a powerful psychological script, often learned in childhood, that love is contingent on achievement. Many workaholics unconsciously believe that to be lovable, they must excel and succeed, mirroring patterns seen in families where love and praise were tied to good grades or accomplishments. Though intentions may have been good, these early messages can morph into a lifelong compulsion to overwork, long after the original sources of approval have faded, driving people to pursue extrinsic rewards with the hope of finally earning the intrinsic ones they truly crave.

The consequences are severe and wide-ranging. Studies show that workaholism is linked to sleep problems, burnout, anxiety, depression, weight gain, high blood pressure, and even strained relationships with family and friends. The compulsion to work often robs individuals of the very connection and happiness they seek, with diminished well-being, little time for self-care, and growing emotional distance at home. Ironically, the more one overworks to gain approval or avoid discomfort, the greater the risk of losing it all, health, relationships, and meaning.

Breaking free from workaholism requires facing difficult truths: recognizing the roots of these compulsions, separating self-worth from work, and deliberately investing time and energy in relationships, offering love and presence, not just financial support. Recovery is a gradual, intentional process, one that may call for habit change, new boundaries, and open conversations with loved ones. The great lesson is that working harder won’t make us more lovable, only being present, open, and caring will. True satisfaction and connection come from giving and receiving what really matters, beyond the lure of professional achievement.


Seeking Biology

I grew up in a family of engineers and experienced skepticism from folks around me about biology as real science. The "real" seemed to apply only to math, physics, and chemistry. Being quite terrible at biology myself, I never got to the point where I could hold an informed opinion. My family and social network followed a well-established pattern of thinking. For centuries, our understanding of reality has been dominated by the Newtonian paradigm. It calls for a vision of the universe as a giant clockwork machine unfolding according to fixed, deterministic laws. Newton’s framework taught us to identify key variables, write universal laws, and define the relevant phase space in advance, a recipe that works spectacularly well for classical physics and even underpins modern quantum theory, which, despite its probabilistic elements, still operates within fixed phase spaces and deterministic evolution of probability. This worldview brought structure and predictability but subtly implied that everything, even apparent randomness, was just a matter of incomplete knowledge.

Yet, biology quietly refuses to fit this mold. The evolution of life on Earth shows us a system that perpetually reinvents itself in ways no set of pre-written equations can capture. The biosphere is not a static assembly of predetermined possibilities. Organisms continuously repurpose their environment, invent new uses, and bring genuinely novel adaptations into being, forever expanding what is possible. This evolutionary creativity can’t be encompassed by a fixed phase space or predetermined variables; instead, it demonstrates that order, complexity, and meaning arise through open-ended, unpredictable processes. The emergence of new species, ecological interactions, and traits are not only beyond prediction but can’t even be fully specified in advance.

The realization that life and perhaps reality itself transcends reductionist, law-bound frameworks has profound implications for science and philosophy. Rather than seeking a “theory of everything” grounded solely in physics, we may need to embrace the unpredictable, self-creating aspects of biological evolution as a fundamental feature of reality. Creativity, emergence, and surprise become central threads in the fabric of existence, inviting us to see biology not as a messy specialization, but potentially as the true key to understanding how the universe invents, adapts, and flourishes beyond the boundaries of deterministic laws.


Uncensored Library

In an era of increasing digital censorship and shrinking press freedoms, creative solutions are emerging from unexpected corners of the internet. One of the most remarkable examples is The Uncensored Library, a sprawling project built within the world of Minecraft and released by Reporters Without Borders in 2020. Crafted by the teams at BlockWorks, DDB Berlin, and .monks, the library uses the popular game’s mechanics to circumvent restrictions on the free flow of information in some of the world’s most authoritarian regimes. By leveraging Minecraft’s global accessibility, the project provides access to banned journalistic works from countries like Mexico, Russia, Vietnam, Saudi Arabia, and more.

The library itself is a masterpiece of digital architecture, composed of over 12.5 million Minecraft blocks and designed to evoke the gravitas of grand institutions such as the New York Public Library. Each country represented in the library has its own dedicated wing, complete with articles both in English and their original languages. The rooms reflect local challenges and journalistic risks, with thoughtful, symbolic interior design. Reading these works in-game becomes an act of digital resistance, allowing players to access censored articles on topics such as government crackdowns, unjust punishment, and COVID-19 reporting, transforming a beloved sandbox game into a tool for activism.

The format is as innovative as the purpose: in-game books on stands can be read by multiple players simultaneously, and the library’s central hall is adorned with press freedom rankings and tributes to those who lost their lives for reporting the truth. As of its launch, over 200 texts had been uploaded, curated to spark awareness and debate among a global audience of gamers, students, and digital citizens. The project’s ability to blend education, activism, and gamification has propelled it to viral status and earned it a 2022 Peabody Award for Interactive storytelling.

More than just a clever workaround to national firewalls, The Uncensored Library is a beacon for what is possible when technology, art, and the quest for truth converge. At a time when traditional channels are frequently blocked or surveilled, this digital monument champions the cause of free expression and demonstrates how virtual worlds can empower real-world voices. The library stands as an enduring testament to both the challenges and the creative potential of defending press freedom in the digital age.

Crowd Anxiety

I am not at all comfortable speaking to large crowds and do way better in smaller settings, which have been more the norm for me professionally over the years. So I do recognize stage fright as a real thing and have experienced it several times. However, popping a pill to feel less anxious seems like a very dubious solution to the problem.

Propranolol, a beta blocker originally approved for cardiovascular issues, has become a popular, fast-growing off-label remedy for anxiety, especially among young women preparing for public events, presentations, and even weddings. This surge in demand has been fueled by endorsement from influencers and celebrities on podcasts and social media, leading to a steep rise in prescriptions since 2020. Many people, led by prominent voices like Robert Downey Jr. and reality TV stars, see propranolol as a non-addictive “magic pill” for easing nervousness, and its mild reputation compared to traditional anti-anxiety drugs adds to its growing cultural cachet.

Telehealth has further broadened access, making the process of securing a propranolol prescription quick and easy, often just a virtual questionnaire away. Various telehealth platforms and startups openly market the medication online following influencer testimonials, touting its ability to transform performance under pressure. However, some doctors express concern that remote consultations may not properly assess contraindications, such as asthma or diabetes, potentially exposing users to fainting or other serious side effects, and some worry that users can easily manipulate self-reported health data.

Despite being considered safer than benzodiazepines for mild, situational anxiety, propranolol’s status as an off-label remedy for nerves is not firmly supported by robust clinical evidence. In the UK, concerns about abuse and toxicity have prompted official investigations and stricter guidance following overdose-related deaths among young people. Users and critics debate whether everyday nervousness truly merits medical intervention, or if reliance on pills stunts the development of critical coping skills.

The broader response to propranolol’s rise reveals social anxieties about over-medication, influencer-driven health trends, and the lack of access to psychological therapies, especially for young people struggling with normal or situational stressors. While some report life-changing benefits for performance anxiety and stress, others raise concerns about lost resilience, side effects, and the wisdom of seeking pharmaceutical fixes for challenges that might otherwise be managed through experience or therapy. This debate highlights an ongoing tension between easy medical solutions and the complexity of emotional well-being in modern society.

Wouldn't it be so much better for all concerned to create opportunities for folks like me to experience that discomfort over and over until it becomes comfortable? Augmented reality could potentially offer ways to achieve that.

Fire Hose

Interesting article about the promise and peril of AI in India. Artificial intelligence adoption is surging in India, with the country quickly becoming OpenAI’s second-largest user base and a focal point for major American tech companies like Microsoft, Google, Meta, and startups such as Perplexity. These firms are investing heavily in AI infrastructure and forming high-profile local partnerships, drawn by India’s massive population of 900 million internet users and a market more open to international tech than China. India offers immense reach, with hundreds of millions of users generating vast amounts of real-world data, and provides AI firms a unique “testing ground” due to its linguistic, social, and economic diversity.

However, converting India’s huge user base into a profitable subscriber market remains a challenge, as average prices for AI services in India are significantly lower than in the West, and the number of paying customers is proportionally small. Nevertheless, the strategic value for AI companies lies in access to unparalleled data, engagement, and the opportunity to fine-tune models with diverse Indian inputs. India’s favorable regulatory regime, allowing free cross-border data flows, enhances its attraction, offering AI firms access to a “fire hose” of new, high-quality training material at a time when global data sources are drying up. Maybe content in vernacular languages will get a shot in the arm, was one of the thoughts that crossed my mind as I read the story.

Indian users generally welcome foreign AI platforms, but there are concerns about long-term dependency and the negative repercussions for domestic IT firms. Critics warn that American tech giants, with superior resources, could stifle investment in local Indian startups, relegating India’s tech sector to peripheral service roles instead of enabling it to build and own foundational AI technology. Though India boasts a vast pool of developers, it still lacks a robust base of AI researchers and innovators relative to its population and ambitions.

The article questions whether India will become a genuine innovator or remain predominantly a consumer and data provider for foreign tech companies. The outcome depends on India’s ability to nurture homegrown research, invest in foundational AI technology, and shift from being merely a vital user market to becoming an originator of global AI breakthroughs. It could also build a moat with local-language content and context that works only for India. It is time to get value from the terrific diversity of everything that makes India the country that it is.

Too Much

I was one of those moms that read a lot of books about how to raise kids, especially ones written by medical professionals. My logic was that I wasn't trained to do the job, did not have role models I could readily emulate, and each mother-child situation is different. Finally, there was no second parent at home who could be a sounding board or hold me accountable. The wisdom of the experts was my way of closing those gaps. Now, as the parent of an adult daughter, I don't feel like I am done learning or am even doing it right. So essays like this one are likely to get my attention.

This broad-brush, one-size-fits-all approach of labeling parental behaviors as codependent oversimplifies the real psychological impact of blurred boundaries in adult relationships. Cultural norms, the family conditions in which the child grew up, and economic realities matter, but they can also excuse patterns that limit independence and hinder growth. Even in societies where multigenerational support is common, a situation where a parent’s identity depends on their child’s can create anxiety and stall healthy individuation.

The key issue isn’t offering help, but recognizing when it leads to mutual dependence that keeps both sides stuck. I have read a lot on this topic, and the solutions are not all black and white. As a parent, you will experience positive emotions if good things are happening to your kid and negative ones if such is not the case. While it need not be tied to your whole identity, chances are you will think about where you fell short if your kid is not thriving. There is little value in that, as there is not much you can undo. However, that self-awareness could promote change going forward in a good way.

This one is particularly interesting to me. I enjoy helping J think through problems, not with the idea of solving them for her but offering ideas that could allow her to come to the answer on her own. She may try some of them but not the others. I seek her input on things as well, because she sees things and thinks about them very differently than I do. Her perspective can sometimes unblock me. I am not sure this is a pattern that needs to be broken.

According to family therapist Virginia Satir’s work on family communication patterns, this over-involvement often reflects the parent’s unresolved needs rather than the adult child’s actual requirements for assistance. The parent may be seeking to fulfill their own emotional needs through excessive involvement.


Getting Right

I have long been a fan of Zapier and enjoyed reading this interview. Zapier CEO Wade Foster’s approach to building an AI-first company centers on blending agentic intelligence with deterministic workflows, advocating the “90% rule” where AI handles the majority of a process but humans supervise the crucial last mile. Foster emphasizes that perfect automation is a myth—most agent-generated outputs are about 90% correct, so designing systems with human oversight ensures quality and builds trust, especially in sensitive contexts like renewals and customer interactions. Companies should avoid rushing to full automation; instead, hybrid solutions using both AI agents for flexible tasks and deterministic protocols for predictable processes achieve the best outcomes.
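The “90% rule” Foster describes can be pictured as a simple routing pattern: agent output that clears a confidence threshold is applied automatically, while everything below it is queued for a human. The sketch below is purely illustrative; the names, the self-reported confidence score, and the 0.9 threshold are my assumptions, not Zapier’s actual implementation.

```python
from dataclasses import dataclass

# Illustrative human-in-the-loop routing for agent output.
# All names and the threshold value are hypothetical.

@dataclass
class AgentResult:
    output: str
    confidence: float  # agent's self-reported confidence, 0.0 to 1.0

def route(result: AgentResult, threshold: float = 0.9) -> str:
    """Auto-apply high-confidence results; send the rest to a person."""
    if result.confidence >= threshold:
        return "auto_apply"    # the ~90% the agent handles alone
    return "human_review"      # the crucial last mile stays human

# A renewal email drafted with low confidence goes to a reviewer.
print(route(AgentResult("Draft renewal email...", 0.72)))  # human_review
print(route(AgentResult("FAQ answer", 0.97)))              # auto_apply
```

The point of the pattern is that the threshold is a product decision, not a technical one: sensitive flows like renewals might route everything to review regardless of score.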

Foster also stresses that AI fluency must be defined per role, not just company-wide, recommending regular hackathons and weekly show-and-tells to sustain AI literacy and cross-team inspiration. Prescriptive selling powers competitive advantage, as buyers now expect sales teams to leverage AI-driven context and recommendations. Critical mistakes that stall AI rollouts include believing agent hype, skipping functional definitions, choosing full automation prematurely, and failing to build an automation-ready culture. Ultimately, the most successful SaaS leaders will be those mastering intelligent hybrid workflows, tailoring AI adoption to specific functions, and keeping humans strategically in the loop.

The model faces several practical limits and risks that can challenge its adoption in other organizations or industries. Regulated fields like healthcare or finance may demand stricter accuracy than a 90% threshold allows, while scaling ongoing training and human oversight can strain company resources. Cultural resistance, workflow complexity, and unpredictable AI errors also pose significant hurdles, indicating that a one-size-fits-all approach is unlikely and that successful hybrid AI deployment depends on context, company readiness, and ongoing adaptation.

Zapier’s own public account highlights frequent missteps in workflows (such as AI hallucinations or broken JSON structures), showing that building reliable and context-aware agents remains a significant challenge even for companies deeply invested in the automation space.
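The “broken JSON” failure mode mentioned above is commonly handled with a validate-and-retry wrapper around the agent call: parse the output, and on failure either re-ask the agent or escalate to a human. This is a generic sketch of that pattern under my own assumed names, not Zapier’s code.

```python
import json

# Minimal guard against malformed JSON from an agent: try to parse,
# optionally re-ask the agent via a caller-supplied hook, then give up
# so the caller can escalate. The `fetch` hook is a hypothetical stand-in
# for a real "please re-emit valid JSON" agent call.

def parse_agent_json(raw: str, retries: int = 2, fetch=None):
    """Parse agent output as JSON, re-requesting on failure."""
    for attempt in range(retries + 1):
        try:
            return json.loads(raw)
        except json.JSONDecodeError:
            if fetch is None or attempt == retries:
                return None  # give up; caller routes to human review
            raw = fetch()    # ask the agent for a corrected payload

print(parse_agent_json('{"status": "ok"}'))  # {'status': 'ok'}
print(parse_agent_json('{"status": ok}'))    # None (malformed, no retry hook)
```

Returning `None` rather than raising keeps the failure on the same human-review path as low-confidence output, which fits the hybrid-workflow framing above.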


Late Adulthood

Per this Atlantic article, America is facing a “longevity revolution” as 100-year lifespans become increasingly common, prompting calls to rethink traditional models of school, work, and retirement. Historically, 20th-century policy achievements like Social Security and Medicare almost eliminated poverty among the elderly, transforming old age from a period of dependency into a stage associated with leisure and security. Communities like Sun City epitomized the “golden years” ideal, but this model was based on assumptions of a shorter life expectancy and a relatively brief period of retirement after a single, long career. I have a friend who retired several years ago and lives in Sun City. She absolutely loves it there, but it's probably too early to say if she'll continue to see it as living her dream if this way of life goes on until she's a hundred years old.

Authors James Chappel and Andrew J. Scott both agree that the retirement system designed in the mid-20th century is now out of step with demographic and fiscal realities. While Chappel calls for further expanding the welfare state for elders, he largely sidesteps fiscal sustainability issues. Scott, by contrast, highlights a new challenge: our added years may be spent in poor health and wasted potential unless society pivots toward prevention, lifelong learning, and workplace flexibility. He argues that policy should prioritize closing the gap between lifespan and health span, and that the concept of retirement itself needs to be reconsidered for an era where living into one’s 80s and 90s is normal.

Scott proposes that we move away from age-based entitlement programs and enable people to remain productive and engaged, regardless of their age, as long as they are able and willing. This approach entails reshaping work environments for older workers, creating opportunities for continued education, and fostering intergenerational collaboration. He introduces the idea of “late adulthood,” a new phase between traditional middle age and old age, where people can pursue second careers, less demanding work, and meaningful contributions to families and communities. I would say that I am in a sense trying to pursue late adulthood by doing things that I would have previously postponed. Someone with a higher risk tolerance could extract a lot more value than I am able to in this phase of life.

Ultimately, the article concludes that the expansion of healthy, satisfying years in late adulthood is potentially the greatest benefit modern society can offer. However, to truly “grasp the gift” of longevity, America must abandon outdated assumptions about aging, invest in prevention and ongoing engagement, and embrace a holistic understanding of aging that recognizes its emotional and social rewards—not just its challenges. The challenge for policymakers and society at large is to create a framework in which longer, healthier lives are both productive and fulfilling, redefining what it means to grow old in the 21st century.


Impending Change

Traditional 9-to-5 middle-class jobs, rooted in the industrial revolution’s factory-based model, are rapidly disappearing as the digital era overturns long-standing work paradigms. The pandemic only accelerated this shift, proving that productivity does not require physical presence or fixed time schedules; instead, flexible and results-oriented arrangements often yield better outcomes. Consequently, organizations are embracing more agile work styles, and old assumptions about workplace structure are being challenged at every level. While many of those things are true, large tech companies are leading the charge in enforcing return-to-office mandates, requiring people to do performative instead of useful work. Smaller companies are copying the model, assuming it must be right. It will be interesting to see where the two opposing forces balance out.

Rather than simply being replaced by robots, middle-class positions are evolving due to artificial intelligence, which is creating new job categories and redefining existing roles. AI acts as a collaborator, automating routine tasks and freeing humans for creativity, strategic thinking, and interpersonal work. The pace of change varies by industry: sectors like technology, financial services, and media are transforming quickly, while healthcare, education, and manufacturing adapt more gradually. Understanding how fast one’s industry will evolve is crucial for planning career adaptation strategies.

Reid Hoffman, LinkedIn co-founder, proposes a three-pillared strategy for career survival: master AI tools, build adaptability as a core skill, and strengthen the unique cognitive abilities that machines can’t replace. Success hinges not on resisting technological change, but on harnessing it, developing fluency with AI platforms and integrating them into daily workflows. Adaptability and lifelong learning now determine professional resilience, while human strengths like critical thinking, creativity, and emotional intelligence become more valuable as routine work is automated.

Ultimately, the true threat is not artificial intelligence itself, but other professionals who leverage AI more effectively and adapt faster. Early adopters gain compounding advantages in skill, opportunities, and network building, and self-assessment is key in planning a transition. The extinction of traditional 9-to-5 jobs signals the beginning of a new era, where proactive learners who see change as opportunity, rather than disruption, will thrive in workplaces that reward strategic adaptation, creative thinking, and continuous curiosity. I see plenty of folks in that category seizing the generational opportunity and running as fast as they can with it but in equal proportion there are those who don't get the point of it all. They are simply unable to work out how they can best partner with AI to maximize their outcomes.

Learning Indochic

I did not know the term Indochic until reading this essay. “Indochic” in postcolonial Vietnam refers to the revival and popularization of French colonial aesthetics in modern Vietnamese architecture, interior design, fashion, and branding. This visual style invokes nostalgia for the colonial past, attracting tourists and affluent locals with imagery of old villas, antique furnishings, and a romanticized hybrid of French and Vietnamese culture. Businesses, especially boutique hotels, cafes, and lifestyle brands, capitalize on Indochic’s allure to offer luxurious, Instagram-ready experiences that evoke sophistication and elegance.

However, this fascination is controversial; critics argue that Indochic perpetuates selective memory and glosses over painful colonial legacies like exploitation and social inequality. By commodifying symbols and motifs from French Indochina, Indochic can obscure historical trauma and contribute to a sanitized narrative of colonial history, sanitized for commercial and aesthetic purposes. These concerns are amplified in discussions on Vietnamese identity, as the style sometimes displaces indigenous values and architectural forms in favor of those seen as “cosmopolitan” or “elite”.

The article concludes that Indochic, while visually appealing and marketable, remains a double-edged cultural phenomenon in Vietnam. Its popularity highlights ongoing tensions between national memory and commodification, where celebrating the colonial past for style and profit risks diminishing complex histories and ongoing struggles for genuine postcolonial expression. The challenge will be for Vietnamese creatives and consumers to engage with Indochic thoughtfully, acknowledging its beauty while critically examining the stories and legacies it represents.

As someone who comes from a once-colonized country, I would find it more satisfying to see a Vietnamese elite and creative class that finds inspiration in pre-colonial, revolutionary, and contemporary Asian sources rather than French or "Indochic" motifs. High-end cafes, hotels, and fashion brands would market themselves as modern, Vietnamese, and globally connected but with deep local roots, using native materials, traditional crafts, and contemporary forms, rather than romanticizing colonial history for commercial gain. This counterfactual vision would frame the relationship with the colonial past as one of critique and transformation, not nostalgia or elegance.

Residual Heterogeneity

The MIT article talks about the relatively short-lived advantages accruing from artificial intelligence, which is set to revolutionize business and society by streamlining operations, boosting productivity, and making data-driven insights widely accessible. As AI becomes increasingly commoditized through open-source models, abundant talent, fierce hardware competition, and rapidly falling deployment costs, all organizations will be able to implement advanced AI solutions. Historical parallels like the internet, personal computers, and genetic sequencing show that once a technology is universally available, it ceases to be the basis of long-term competitive advantage.

While early AI adopters may enjoy temporary advantages, lasting differentiation will not arise solely from using AI, since competitors will soon gain access to comparable tools, data, and talent. Even supposed advantages like access to the best models, proprietary data, or top engineering talent are increasingly being eroded by industry-wide sharing, academic openness, and rapidly scaling training resources. The performance gap between large and small models is narrowing quickly, further diminishing prospects for unique, sustainable AI-driven gains.

The authors emphasize that true and resilient competitive advantage requires more than technology: it demands what they call “residual heterogeneity.” This refers to the unique creativity, drive, and technical ingenuity that cannot be replicated or commoditized by AI, no matter how powerful or accessible it becomes. Human creativity, whether in novel partnerships, unique customer engagement, or innovative product design, remains the critical differentiator for companies aiming to rise above an AI-leveled playing field.

Examining the criteria for sustainable advantage (value, uniqueness, and inimitability), the article contends that AI falls short on the latter two. Advantages derived from AI solutions, whether relating to products, processes, or strategies, are generally valuable but neither unique nor hard to imitate. Even proprietary data is losing its protective value, as wide adoption of open datasets and synthetic data make it easier for others to catch up.

Ultimately, the authors argue that as AI homogenizes business capabilities, companies must strategically invest in developing the creative capacity of their workforce and embracing innovative, boundary-pushing business practices. The ability to imagine new possibilities, form unexpected connections, and leap beyond what AI can interpolate will remain intrinsic to lasting success. Human ingenuity, passion, and the relationships that nurture such qualities are what will differentiate great organizations even in an AI-pervasive future.


Repeat Lessons

Jeff Lawson, founding CEO of Twilio, shares ten lessons from his journey growing Twilio from inception to IPO, emphasizing the importance of keeping compensation simple and mission-driven. He argues that founders should focus on fairness in pay rather than complex incentive structures, drawing from Daniel Pink’s “Drive”: after fairness is achieved, additional complexity breeds discontent and distraction. Lawson also categorizes scalable developer-focused companies into three groups: Business Development as a Service, Capex as a Service, and Algorithm as a Service, stressing the need to understand where a product fits before building.

Lawson highlights how infrastructure companies are better insulated from disruption during technological shifts like AI, while SaaS companies relying on seat-based pricing face existential threats. He maintains that infrastructure businesses can embrace AI aggressively because their core model is not threatened, whereas traditional SaaS must immediately develop strategies to survive mass automation. Product expansion, according to Lawson, requires forethought; founders should design products with growth in mind, knowing that certain models can limit future innovation and expression.

Another major lesson revolves around the mindset and risk of founders compared to investors: Lawson believes founders put everything on the line (time, money, and reputation), while VCs risk only a fraction of their portfolio. This dynamic should inform negotiations and respect between both parties. He also warns about public-company M&A, stating it’s often worse to miss a strategic deal than to make a bad one, and that corporate cash enables different investment logic than conventional VC; sometimes not losing money is prioritized over maximizing returns. This could balloon into an insurmountable problem down the road.

Mission is central to founding successful companies, not just financial motivation; founders must be driven by purpose to endure inevitable challenges. Lawson sees the AI wave as a once-in-a-generation chance for infrastructure companies, drawing parallels to the impact of mobile and cloud. He emphasizes basic due diligence in business to prevent high-profile fraud, arguing that a lack of fundamental checks invites risk. Ultimately, Lawson’s “meta-lesson” is to always respect business fundamentals, understand market positioning, and put mission above chasing trends to build enduring companies through disruptive times like the current AI revolution.


Imagined Fake

“Superfake” luxury handbags, counterfeits nearly indistinguishable from genuine products, are upending the economics of the high-end fashion industry per this WSJ story. Unlike previous knockoffs, which were often obviously fake, these bags are created with extreme attention to detail, sometimes using stolen digital templates (tech packs) from brands, and are manufactured in Chinese factories that also operate legitimate businesses. Social media influencers promote these superfakes, normalizing their purchase among younger consumers, who increasingly view buying replicas as a rebellion against luxury markups and industry practices.

I am no expert on luxury brands, but I enjoy people watching at airports and the like, speculating whether the brands people are wearing are real or fake and thinking about what signals might contribute to my assessment one way or the other. What is notable is that I almost always assume it's a fake and try to see if there is incontrovertible evidence that it is not. There never is, at least for an uninformed average person like me. So it really does call into question the social signal value of luxe brands.

Gen Z and younger shoppers are spending billions less on authentic luxury handbags, driven by both economic pressures and skepticism about the high prices versus actual production costs. Counterfeiters exploit this sentiment, offering near-identical bags for a fraction of the price, often claiming to use the same materials and skilled labor as authentic brands. These sellers operate online through encrypted channels, such as WhatsApp, Telegram, and invite-only social media groups, making enforcement difficult and direct-to-consumer shipments almost impossible for customs officials to intercept.

Luxury brands have ramped up authentication technologies, like x-ray machines and chemical analysis, but with the quality gap closing, the industry faces new challenges. Some experts argue that counterfeit goods could even act as a gateway for future luxury customers, while others note the real financial impact and threats to brand exclusivity. Despite rising superfake sales, luxury firms’ minimal investment in anti-counterfeit efforts suggests they are not yet taking the problem as seriously as they do in marketing and brand management.

Building Fast

Another interesting paper on AI. It presents a new AI system that automatically writes expert-level scientific software to solve empirical research tasks, such as data analysis, forecasting, and modeling, where the goal is to maximize a measurable quality metric (for example, prediction accuracy or fit to data). The approach combines a large language model (LLM) with tree search (TS), allowing it to systematically generate, evolve, and select code solutions that improve the score for a given scientific problem. I could see all of what it does being useful for business applications as well.

At its heart, the system works by prompting an LLM with a description of the scientific task, existing code, evaluation metrics, and relevant research ideas sourced from papers or textbooks. The LLM rewrites code candidates, which are then scored and explored in a tree search, efficiently navigating the space of possible solutions. The TS mechanism uses an upper confidence bound strategy to balance trying promising candidates (exploitation) and exploring new variants (exploration), leading to rapid jumps in solution quality as improvements are discovered. This would be a very tedious process if done without AI assistance.
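The generate-score-select loop described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the LLM rewrite-and-evaluate step is stubbed out with a random score perturbation (`mutate_and_score`), scores are assumed to lie in [0, 1], and the UCB formula is a common smoothed variant.

```python
import math
import random

class Node:
    """A candidate program in the search tree: its source and measured score."""
    def __init__(self, code, score, parent=None):
        self.code, self.score, self.parent = code, score, parent
        self.children, self.visits = [], 0

def ucb(node, t, c=1.4):
    # Smoothed upper confidence bound: the candidate's score (exploitation)
    # plus a bonus that shrinks the more often it has been expanded (exploration).
    return node.score + c * math.sqrt(math.log(t + 1) / (node.visits + 1))

def mutate_and_score(node):
    # Stand-in for the real LLM rewrite + evaluation step: perturb the
    # parent's score randomly so the sketch stays self-contained and runnable.
    child_score = min(1.0, max(0.0, node.score + random.uniform(-0.05, 0.1)))
    return Node(code=node.code + "  # revised", score=child_score, parent=node)

def search(root, iterations=100):
    # Repeatedly expand the candidate with the highest UCB value and
    # add the revised candidate back into the pool of tree nodes.
    all_nodes = [root]
    for t in range(1, iterations + 1):
        node = max(all_nodes, key=lambda n: ucb(n, t))
        node.visits += 1
        child = mutate_and_score(node)
        node.children.append(child)
        all_nodes.append(child)
    return max(all_nodes, key=lambda n: n.score)

random.seed(0)
best = search(Node(code="baseline_solution()", score=0.5))
```

The exploration constant `c` controls the trade-off: a larger value keeps revisiting poorly scored branches, while a smaller one concentrates effort on the current best candidate, which is exactly the balance the paper's tree search is tuning.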

The paper demonstrates the system's capabilities across multiple fields: it outperformed human-made software on public leaderboards for single-cell RNA sequencing data integration, COVID-19 hospitalization forecasting, satellite image segmentation, neural activity prediction in zebrafish, time series prediction, and numerical solution of challenging integrals. Notably, it created dozens of novel methods: for example, 40 new approaches for single-cell analysis and 14 for COVID-19 forecasting that beat the prior best human and CDC ensemble solutions in independent benchmark tests.

For scientists and engineers, this AI system has transformative potential: it drastically shortens the time needed to create, test, and optimize research software, turning months of work into hours. It can serve as a co-scientist by generating, recombining, and refining ideas from the literature or domain experts, speeding up hypothesis testing, benchmarking, and discovery pipelines in computational biology, epidemiology, environmental studies, neuroscience, data science, and mathematical modeling. By rapidly producing validated high-performance software, scientists can more quickly explore alternative approaches and advance the frontiers of their disciplines. For non-scientists among us, opportunities abound as well.


Root Cause

In today’s corporate theater of the absurd, CEOs are losing their minds because employees dare to text during meetings. Jamie Dimon is reportedly so enraged he’s taken to shouting at iPads, while Brian Chesky has discovered the radical notion that even he, the genius founder of Airbnb, is sometimes bored in his own meetings. Their diagnosis? Society is collapsing because people can’t sit still and gaze adoringly at PowerPoint slides for an hour. Their cure? Hide the Wi-Fi passwords, impose phone fines, and shame the sinners who check Slack.

But let’s be honest: the problem isn’t that workers are texting. That is merely a symptom of the malaise. It’s that the meetings are unbearable. They are bloated, performative rituals where executives hear themselves talk while everyone else checks out mentally or digitally. If your staff is scrolling through Instagram, maybe it’s because your “mission alignment sync” could have been an email. Forcing people to surrender their phones doesn’t fix disengagement; it just removes the only coping mechanism keeping them conscious. Maybe the execs who are in love with the sound of their own voices, and the bombast they can't control, should think long and hard about how to live with the idea of not being the most important or relevant person in the room. That could magically cure a lot of ills.

Meanwhile, the same bosses fuming about “disrespect” are the ones who expect employees to be available 24/7. They’ve trained their teams to live on their devices, and now they’re scandalized when that conditioning shows up in a conference room. It’s not a phone problem; it’s a leadership problem. Employees aren’t distracted because they’re lazy. They are bored out of their minds and they’re multitasking to survive a culture that worships meetings and mistakes busyness for impact.

So before CEOs confiscate phones or install corporate “swear jars,” maybe they should ask themselves a harder question: if no one wants to pay attention when you talk, are the phones really the issue or is it you?

The Player

I watched Robert Altman's The Player recently and it brought David Lynch's Mulholland Drive to mind almost at once. Hollywood has always been fascinated with itself. The two movies seem to stand as sharp but contrasting mirrors. Both expose the gap between the dream factory’s polished surfaces and its darker truths, yet they do so in radically different ways. Altman leans on satire, embedding a murder mystery within the banal routines of studio executives, while Lynch unfolds a surreal fever dream where ambition, desire, and identity collapse into an unending nightmare.

At heart, each film asks what happens when people surrender to Hollywood’s logic. In The Player, Griffin Mill commits murder yet prospers, rewarded precisely because he understands the cynicism of the system. Altman’s Los Angeles is sunlit and ordinary, but its everyday chatter conceals ruthless self-preservation. Mulholland Drive follows the opposite trajectory: Diane Selwyn cannot bend herself to the industry’s demands, and her psyche simply fractures. Lynch’s Los Angeles is shadowy, electric, and uncanny, where the seduction of stardom curdles into despair.

The difference lies in tone as much as outcome. Altman’s critique is sly and ironic, laughing at Hollywood’s absurdity even as he implicates us in its allure. Lynch’s vision is operatic, frightening, and tragic, where dreams literally dissolve into dust. Together, the films bracket Hollywood’s self-image: one reveals the comedy of power, the other the horror of loss. Seen side by side, they suggest that beneath the shimmering surface, Hollywood runs on a mix of fantasy and corruption ranging from funny to terrifying, but always consuming. I love it when one good movie makes me re-think my experience of another one, equally good.

Living Duplicate

Reading this Barbara Kingsolver quote made me smile as I recalled so many books over the years that did exactly that for me. 

Literature duplicates the experience of living in a way that nothing else can, drawing you so fully into another life that you temporarily forget you have one of your own. That is why you read it, and might even sit up in bed till early dawn, throwing your whole tomorrow out of whack, simply to find out what happens to some people who, you know perfectly well, are made up.

I will admit that there have been movies and shows that created the same experience for me. I had to know how things ended for the characters, some of whom I was rooting for, while others I found deeply unsympathetic and still wanted to know how they fared. You want to know how it is for folks you can't understand. Maybe the way things turn out will help you see their view of the world, which till then was obscured from view.

It hardly mattered that the characters were all made up and the things that happened to them were as well. The way the story is told makes all the difference.

Creator Economy

Syracuse University has made a bold move in higher education by launching the nation’s first academic Center for the Creator Economy. This ...