Fragmented Structure

I watched the movie 38 recently and struggled to form any real connection with it. That got me thinking about the kinds of stories films tend to tell, and how those patterns shape our expectations. More often than not, movies feel formulaic because they overwhelmingly follow a centuries-old narrative structure, rooted in Aristotle’s Poetics and later formalized into the three- or five-act format and the “hero’s journey” monomyth. This structure, now codified by screenwriting gurus and the story-structure industry, is so pervasive that even experimental films often conform to its hidden scaffolding.

The formula persists because it offers audiences a sense of order, transformation, and hope, but it also reinforces conformity, subtly promoting conservative values and the status quo. Maybe that is what audiences crave, or perhaps it is what the "system" wants them to crave.

38 rejects all of that. It breaks away from the traditional hero’s journey and linear storytelling in favor of a fragmented, introspective structure. Instead of following a clear arc of conflict, transformation, and resolution, the film immerses viewers in the fractured psyche of a thirty-eight-year-old woman obsessed with the social media life of the younger woman who ended their relationship. The narrative unfolds through vivid interruptions of sound and image, reflecting the protagonist’s internal turmoil and the blurred boundaries between online and real life. It refuses to move toward a tidy conclusion or any kind of restored normalcy.

While I appreciated the movie on its artistic merits, it left me feeling unsatisfied and disoriented, maybe because I missed the familiar scaffolding I’ve grown so used to.


Front Door

I was chatting with a friend who works at a B2C company with a deplorable website, one that’s needed an overhaul for at least a decade. They’ve built a loyal customer base, but leadership knows their window to adapt is closing fast.

B, who leads digital transformation there, says AI agents like ChatGPT, Perplexity, and Gemini are fundamentally redefining what digital transformation even means. These tools are shifting discovery, recommendation, and transactions away from traditional brand websites and into AI-driven interfaces. As a result, website UX is becoming less critical, while real-time, structured product data, delivered via APIs and feeds, is becoming essential for visibility in these systems.

For B, this shift is huge. Her company is miles behind the competition digitally, but now they have a chance to leapfrog: do something bold, stop optimizing for human visitors, and start optimizing for AI models. Their sad, dated website becomes just a reference layer; the AI interface is the new front door for their customers. They can just work on getting that door right.
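As a rough illustration of what "optimizing for AI models" could look like in practice, here is a minimal sketch of the kind of structured product record B is talking about. It uses the public schema.org Product vocabulary; the product, prices, and endpoint are made up, and the real work is keeping a feed like this accurate and current behind an API.

    import json

    # A hypothetical product record expressed as schema.org JSON-LD,
    # the kind of structured data an AI shopping agent can ingest
    # directly instead of scraping a brand's website.
    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Everyday Trail Shoe",          # made-up product
        "sku": "ETS-2025-BLK-10",               # made-up SKU
        "brand": {"@type": "Brand", "name": "ExampleBrand"},
        "offers": {
            "@type": "Offer",
            "price": "89.00",
            "priceCurrency": "USD",
            "availability": "https://schema.org/InStock",
        },
    }

    # Serving records like this from a feed or API, and keeping price
    # and stock current, is what the "new front door" amounts to.
    print(json.dumps(product, indent=2))

Whether an agent reads this through a feed, an API, or JSON-LD embedded in a page, the point is the same: the data, not the page design, is what it consumes.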

I asked B if she finds this exciting. She was ambivalent. "Only if we get it right with the AI agents and don't screw it up like we did with web and mobile," she said, not sounding confident that her leadership had the vision and drive to make it happen. This was an interesting conversation, especially in light of what D has been saying about the fate of UX in the world.

Becoming Reliant

At happy hour recently, a friend of a co-worker, R, who works at an AI startup, compared the current widespread use of AI to the early days of Uber. R noted that AI tools (like the one her company makes) are heavily subsidized by venture capital and have quickly become indispensable for many users. She pointed out, correctly, that these subsidies will eventually disappear, leading to higher costs or increased influence from advertisers.

Uber is essentially unaffordable in most cities I travel to for work, and often the local taxi service is more cost-effective. So R is perfectly right about what the future holds. There were a couple of engineers there with AI startup and side-hustle ideas, and R told them that if they plan to build with AI, they should act now while it remains affordable. That got the group chatting about the importance of maintaining traditional skills, as those who can produce quality work without relying on expensive AI services will be highly valued if prices rise.

This is particularly true, I think, for early-career professionals who need to wean themselves off their dependence on AI to do a job they haven't even had a chance to fully learn. If you are closer to retirement age, the consequences in professional life might be less dire. You might have the runway to enjoy the productivity boost from the tools while the going is good, as R says, and happily walk into the sunset to enjoy retired life.

Our group covered a large age range, from fresh out of college to getting ready to retire this year. R gave everyone food for thought that evening. She herself is all-in with AI tools at work, because opting out is not an option. But for an industry veteran like her, it's likely pretty low-risk. She had a great career before AI and will likely continue to have one after the dust settles and things go the way of Uber.

Saving More

I did not know about Zepto, the quick-commerce grocery app, but found this strategy interesting. They have a new 'Swap and Save' feature, which automatically suggests cheaper alternatives to items in a customer's cart. Needless to say, it has sparked backlash from several direct-to-consumer (D2C) brand founders.

While Zepto promotes the feature as a way to help customers save money, D2C brands argue it undermines premium positioning, disrupts customer conversion, and increases pressure to spend on advertising just to remain visible in shoppers' carts. There is concern that the feature could force brands to offer discounts or pay to avoid being swapped out, favoring larger advertisers and making it harder for smaller brands to compete. Despite the controversy, Zepto is still testing the feature and planning a wider roll-out.

The problem viewed through the lens of the consumer looks quite different, at least in the short run. If a person is on a tight budget and there is a service that helps them do more with less, it's only a win. The fact that this creates a race to the bottom is not their greatest concern. There is more that could be done: D2C brands could go a lot further to help consumers, tracking their shopping cart or wishlist and buying when prices are lowest, swapping and saving when the customer says that is okay but sticking with the requested brand when it is not.
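For illustration only, here is a minimal sketch in Python, with entirely made-up items and a made-up catalog of alternatives, of that last idea: swap an item for a cheaper alternative only when the customer has opted in for it, and leave the requested brand alone otherwise.

    # Hypothetical cart items, each with a per-item "allow_swap" preference.
    cart = [
        {"name": "Premium Oat Milk 1L", "brand": "BrandA", "price": 4.50, "allow_swap": True},
        {"name": "Artisan Coffee 250g", "brand": "BrandB", "price": 9.00, "allow_swap": False},
    ]

    # Made-up catalog of cheaper alternatives, keyed by item name.
    cheaper_alternatives = {
        "Premium Oat Milk 1L": {"brand": "StoreBrand", "price": 3.20},
    }

    def swap_and_save(cart, alternatives):
        """Swap an item for a cheaper alternative only if the customer opted in."""
        total_saved = 0.0
        for item in cart:
            alt = alternatives.get(item["name"])
            if item["allow_swap"] and alt and alt["price"] < item["price"]:
                total_saved += item["price"] - alt["price"]
                item["brand"], item["price"] = alt["brand"], alt["price"]
        return total_saved

    print(f"Saved {swap_and_save(cart, cheaper_alternatives):.2f} by swapping opted-in items")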

Too Good

A former colleague, L, who is looking for a new job, shared his recent interview experience. After a positive interview, L was rejected for a job because the company felt his resume was "too tailored" to the position. He expressed frustration that, unlike many applicants who use AI tools to generate customized resumes, his resume was genuinely written by him and accurately reflected his real skills and accomplishments. He believes that hiring managers are now struggling to distinguish authentic applications from AI-generated ones, resulting in flawed IT hiring practices. He also criticized the interview process for being ambiguous and often led by interviewers who seem unqualified to assess candidates properly, something I have heard from a lot of other folks too.

That last part about unqualified interviewers struck a chord with me. When people luck into roles they do not deserve and lack the skills to perform, they create a tremendous drag force for the company. They will do their best to keep out anyone (like L) who has demonstrable ability to perform (and therefore outshine them handily). They will also block the hiring of managers who would easily identify that they are unfit for the role and not performing at the job; that kind of hire would jeopardize their own positions. The force of resistance to qualified hires rises in exponential proportion to the salaries of these folks. So we have a situation like the one L is running into: his qualifications seem too extraordinary to be true when viewed by peers and managers less competent than him. They immediately conclude this could not be real and that AI must have made up the resume for him. That becomes a great excuse to reject him. I do believe things will change.

Seeking Rare

A UX designer I worked with a long time ago recently shared a long rant about AI-generated design. In D's opinion, generative AI tools, including new ones like those in Figma, can be useful, but they don't replace the need to understand the design process. True design begins with a clear goal and involves thoughtful exploration and refinement, which AI often skips by jumping straight to a finished result. This shortcut can undermine creativity and the critical thinking that shapes strong design outcomes. Ultimately, while AI can assist, it's the designer's own understanding and intent, the "why" behind every decision, that drives meaningful results. The tool is just a tool; real intelligence comes from the designer.

I have some sympathy for his cause, but not a whole lot. The "real intelligence" that D speaks of is not abundantly available in the UX community. A lot of folks I have worked with over the years follow the design process as if it were a guarantee of outcome. The thinking seems to be that as long as the full battery of artifacts is deployed, we will arrive at the right answer, which is extraordinarily flawed. I have had designers go through days and weeks of such sessions and produce deliverables that are in no way better than what AI can produce these days.

There are those who do bring something special to the table, and I have had the privilege of working with them as well. These folks are able to distill very complex ideas into something elegant, intuitive, and simple, and they can do this just by listening to users closely and asking them thoughtful questions along the way: no frills, no elaborate workshops. The first version of their concept can even be pen and paper. But they get it right. AI can take those rough notes and expert direction and execute, saving them the mundane work. D may not like to hear this said about his beloved design community, but the truth is that real talent that cannot be replaced by AI is rare, and that caliber of talent will not get displaced.

Being Adult

Any parent who has watched their child reach adulthood has wondered at what age that adulthood becomes real rather than merely conceptual.

…everybody is unique, there's no standard timeline for growing up. Some people learn how to control their emotions, develop the judgment to make good decisions and manage to earn enough to support themselves by the age of 18.

There are so many facets to a person that it would be hard to declare someone an all-around adult just because they are demonstrably mature in some areas; they could be woefully behind in others.

Growing up is about gaining experiences, making mistakes and learning from them, while also taking responsibility for your own actions. As there’s no single definition of adulthood, everyone has to decide for themselves whether or not they’ve turned into a grown-up yet.

As a parent, you want to see your "adult" child demonstrate an ability to function that meets or exceeds yours at the same age, assuming you felt you were a well-functioning adult at that reference age. If the child exceeds you in some areas, that is cause for celebration, but we are not nearly as able to accept that they lag us in others, and so we are reluctant to declare them fully adult.

Good Direction

Like many, I have plenty of reasons not to like Google, but letting users download and run a variety of open-source AI models directly on their smartphones, without needing an internet connection, is great news. The app, currently available for Android (with iOS coming soon), lets users access models from platforms like Hugging Face to perform tasks such as image generation, question answering, and code writing, all processed locally on the device's hardware. I only hope there's no catch. If this is even somewhat workable, it would represent a shift toward making AI more accessible, private, and customizable by empowering users to run advanced models locally, rather than relying on cloud-based services.
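I cannot show the Android app itself here, but as a rough desktop analogy of the same idea, here is a minimal sketch that pulls a small open-source model from Hugging Face and runs generation entirely on local hardware using the transformers library (distilgpt2 is just an example model; the transformers and torch packages need to be installed).

    # Rough desktop analogy of on-device inference: the model weights are
    # downloaded once, then text generation runs locally with no further
    # network calls.
    from transformers import pipeline

    # Small open-source model chosen only as an example; any local
    # text-generation model would do.
    generator = pipeline("text-generation", model="distilgpt2")

    result = generator("Running models locally means", max_new_tokens=30)
    print(result[0]["generated_text"])

After the first download the weights are cached, so subsequent runs need no network connection at all, which is the whole point.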

While that is generally good for users, they have to take on more risk. If they run sensitive or private data through these models and their device is not secure, that could present its own kind of problems. A fully air-gapped device could help. There is also the bad-actor problem (though bad actors have plenty of options already). Running AI models locally makes it easier to bypass safety controls and use AI for generating harmful content, coding malicious software, spreading misinformation, or enabling scams and illegal activity. The lack of centralized moderation and the potential for model jailbreaking and adversarial attacks significantly expand the risk surface for misuse. In this instance, all of that could happen from a phone, which is convenient.

One can hope Google had some altruistic intentions here beyond trying to put the brakes on the cloud-based contenders who are getting ready to eat its ads and search lunches.
