My parents aren't tasting menu people

Mark M


How AI agents make recommendations

Someone told me this story at a dinner party a few months ago. A friend of hers had been trying to find somewhere for her parents' anniversary. "Romantic," she'd said. "Somewhere special. They've been married forty years."

She thought she'd give ChatGPT a try.

It came back with a country house in the Cotswolds. Tasting menu, quiet gardens, the sort of place where you speak softly at dinner. Her parents would have hated it.

They're not tasting menu people. They're walk-along-the-canal people. Pub-with-a-fire people. Her dad would have spent the whole meal worrying about which fork to use. Her mum would have found it stuffy. Forty years of marriage, and they'd have come home feeling like they'd performed an anniversary rather than had one.

The recommendation wasn't wrong, exactly. It was romantic. It was special. It just wasn't for them.

The word doesn't travel well

Romantic means different things to different people. This is obvious when you say it out loud, but easy to forget when you're asking a machine.

To some people, romantic is a long walk that ends somewhere warm. To others, it's a late dinner that turns into dancing. It can mean silence and white tablecloths, or a shared plate in a crowded room. It can mean being completely alone, or being surrounded by people having a good time.

She knew what she meant. Her parents would have known what they meant. But "romantic" on its own carries almost none of that. It's a pointer to something specific, dressed up as something general.

What a good recommendation requires

When you're looking for something yourself, you do this translation in your head without noticing. You scroll past the places that aren't right. You linger on the ones that feel like they might be. The understanding of what you actually want is already there, shaping what you see.

When you ask someone else to find something for you, that changes. Now they have to understand it too.

A friend who knows you well can do this. They know you're not tasting menu people. They know your mum hates fuss. They'd think of the pub on the canal before they'd think of the country house, because they know who you are, not just what you asked for.

An AI making a recommendation faces the same problem. "Romantic" isn't enough. The question is whether it knows anything else.

There are two sets of knowledge. For an AI recommendation to land, you need to know both sides.

You need to know something about the person. Not their age or where they live, but what they're actually like. Whether they'd rather be impressed or feel at home. Whether quiet is peaceful to them or uncomfortable. Whether they'd find a tasting menu exciting or a performance.

And you need to know something about the place. Not just that it has rooms and a restaurant, but what it's actually like to be there. Who feels comfortable. Who doesn't. Whether it's the kind of place where you'd propose or the kind of place where you'd decompress after a long week.

The country house and the pub both exist. They're both "romantic" in some sense. But they're for different people. The recommendation is in the matching, not the finding.

What happened next

That conversation stayed with me, and I knew exactly why. The gap she was describing was the same one I'd been thinking about for months.

A few weeks later, my own parents asked me to help them find somewhere. Different occasion, not an anniversary, but the same impulse. Somewhere nice. Somewhere that would suit them.

This time I knew what to do. I didn't just describe what they were looking for. I described who they are. What makes them comfortable. What doesn't. My dad's the same as hers, actually. He doesn't relax into formality. My mum likes a bit of life around her.

It worked better. After a bit of back and forth it suggested somewhere I wouldn't have thought of, and when I looked it up it made sense. They went. They loved it.

But I noticed there was still a gap. The AI could match my words to descriptions. What it couldn't quite do was know the places well enough. Whether my parents would actually feel at ease there. I was still filling that in myself, reading between the lines, looking at photos and imagining them in the room.

I keep having versions of that dinner party conversation. Someone hears I work with AI, and they have a story. The gap is always in the same place. The AI doesn't know enough about the person. Or it doesn't know enough about the place. Or both.

It's closeable. But it needs both sides to be known.

I keep thinking about her parents, and mine, and how much of what we call "romantic" is really just knowing who someone is.
