Opinion: The place where AI's influence is felt most is the place we look least closely: the everyday stuff like banking, shopping, travel, CVs, streaming and hospitals
We picture AI as spectacle: lab breakthroughs, humanoid robots, grand announcements. In practice, it arrives as paperwork, queues, and menus. It decides which email you see first and which never reaches you. It nudges a meeting time onto your calendar and a route onto your screen. It moves a chest X-ray up a list and a card payment down a risk score. It adjusts a fare between browser refreshes and a price between clicks. Nothing dramatic; just a steady sorting of everyday life.
Take hiring, for example: most CVs never reach a recruiter’s desk. They go into software that pulls out job titles, dates, skills and employers, then matches those details to the wording of the role and to the patterns of people hired before. The process feels efficient because it is, but it is also conservative. If yesterday’s workforce provides the template, the system leans toward more of yesterday: linear careers, familiar badges, tidy timelines. Gaps, zigzags, volunteer work and non-standard routes look like noise rather than promise. People can fall out of the process before any human reads a line, and the gate closes without a sound.
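To see how a gate like this can close silently, here is a deliberately crude sketch of a keyword-overlap screen. The keyword list, threshold and scoring rule are all invented for illustration; real applicant-tracking systems are more sophisticated, but the failure mode is the same: a CV that does not mirror the role's wording scores low, whatever the person can actually do.

```python
import re

# Hypothetical keywords extracted from a job advert (illustrative only).
ROLE_KEYWORDS = {"python", "sql", "data", "analyst", "reporting"}

def screen(cv_text: str, threshold: float = 0.6) -> bool:
    """Return True if the CV clears the automated gate."""
    words = set(re.findall(r"[a-z]+", cv_text.lower()))
    overlap = len(ROLE_KEYWORDS & words) / len(ROLE_KEYWORDS)
    return overlap >= threshold

# A CV that echoes the advert's wording passes...
print(screen("Data analyst: Python, SQL, reporting dashboards"))  # True
# ...while a career changer with relevant but differently worded
# experience is filtered out before anyone reads it.
print(screen("Built statistical models for clinical studies"))    # False
```

The point of the toy is the asymmetry: the rule rewards resemblance to past wording, not ability, which is exactly the conservatism the paragraph describes.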
Money works in a similar way, only the stakes are higher. Banks and lenders build scores to predict who will repay and at what risk. They fold in payment histories, the age of accounts, usual balances and recent changes. Insurers do the same with claims histories and exposure. Each step is a probability dressed up as a decision. Alongside this runs the protection layer: systems that watch for card activity that does not fit your past behaviour, and identity checks that compare a selfie to a document and test that a live person is on the other side of the camera. When it works, you glide through. When it misfires, a card is declined or an account is locked and the only explanation is that the machine has said no. These systems are accurate on average and blunt at the edges. They compress risk into a number, and, in the compression, detail disappears.
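The "does this fit your past behaviour" check can be sketched in a few lines. This is a toy z-score test with made-up numbers, not how any bank actually scores transactions, but it shows both sides of the paragraph: the compression of a spending history into one number, and the bluntness at the edges.

```python
from statistics import mean, stdev

def looks_anomalous(history: list[float], amount: float,
                    z_cut: float = 3.0) -> bool:
    """Flag a payment that sits far outside past behaviour."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    # Distance from the customer's usual spend, in standard deviations.
    return abs(amount - mu) / sigma > z_cut

history = [12.5, 9.0, 15.0, 11.2, 14.3, 10.8]  # invented card history
print(looks_anomalous(history, 13.0))   # fits the pattern -> False
print(looks_anomalous(history, 950.0))  # far outside it -> True
```

Accurate on average, blunt at the edges: a genuine one-off purchase looks exactly like fraud to this rule, and the only explanation available is that the number crossed a line.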
Hospitals, too, now rely on queues shaped by prediction. In busy radiology departments, software scans X-rays and CT images for signs that need urgent attention and moves those cases to the top of the list. In emergency rooms, forecasting tools look at time of day, season, local events and historical flows to predict when beds will be scarce. In primary care, risk scores combine symptoms, vital signs and past records to suggest how quickly someone should be seen. Speed is the obvious gain. The subtler change is in attention. Once a list is reordered by a machine, clinicians review cases in the order they were scored, not the order they arrived. Groups that the data under-represents, rare presentations and atypical clusters can slip down the queue without any single person choosing that outcome. The tools are there to help, but help can shape where eyes go first.
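The shift from arrival order to score order is small enough to show directly. In this illustrative sketch, each case carries an urgency score one might imagine coming from an imaging model; the scores and timestamps are invented.

```python
# Illustrative worklist: (arrival time, model's urgency score in [0, 1]).
cases = [
    ("arrived 09:00", 0.12),
    ("arrived 09:05", 0.91),  # flagged as likely urgent
    ("arrived 09:10", 0.47),
]

# Once the list is reordered, clinicians read in score order,
# not the order the patients arrived.
worklist = sorted(cases, key=lambda c: c[1], reverse=True)
print([c[0] for c in worklist])
# ['arrived 09:05', 'arrived 09:10', 'arrived 09:00']
```

One `sorted` call is all it takes: the 09:00 case, which the model scored low, now waits behind everyone, and no individual person chose that.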
Consider travel as another example. Airfares are not fixed prices set by a clerk. They are the output of systems trained on how quickly seats tend to sell on a route, how close it is to departure, how full the plane is and what rivals are charging. The numbers move as those signals change. Two people can check minutes apart and see very different fares, not because someone changed their mind, but because the system updated its view. On the ground, mapping and dispatch tools work from a forecast rather than a snapshot. Your phone, the cars around you, reports of incidents and past traffic patterns feed a prediction of where congestion will be in the next twenty minutes, and the app chooses a route for that future. Multiply this across a city and something odd appears: guidance alters the very traffic it predicts. Streets that were quiet become cut-throughs when thousands receive the same shortcut. What is optimal for one driver can make the day worse for the street they all use.
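The "fare that moves between refreshes" can be caricatured in a few lines. The base price, load multiplier and close-to-departure premium below are all invented; airline revenue management is far more elaborate, but the shape is the same: the price is a function of signals, and the signals keep changing.

```python
def fare(base: float, load_factor: float, days_out: int) -> float:
    """Toy demand-based fare: fuller plane and nearer departure cost more."""
    price = base * (1 + load_factor)   # fuller plane -> higher price
    if days_out < 7:
        price *= 1.5                   # close to departure -> premium
    return round(price, 2)

print(fare(100.0, load_factor=0.4, days_out=21))  # 140.0
print(fare(100.0, load_factor=0.4, days_out=3))   # 210.0
```

Two people checking minutes apart can hit different values of `load_factor` and see different numbers, with no human having changed anything in between.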
Shopping is not a neutral shelf either. The order in which products appear online is the result of ranking. Stores learn what people who resemble you tend to choose and move those items forward. Click a brand often enough and it appears more; appearing more encourages more clicks; the loop tightens around what is familiar. Advertising sits on top of that. As a page loads, your attention is auctioned in a blink to the buyer who believes their message will perform best for someone like you, on that device, in that moment. What feels like discovery may be the outcome of that auction. Even prices learn. Sometimes they rise and fall with demand and time. Sometimes they vary by segment: new customer, loyal customer, bargain hunter. Either way, the number on the button reflects a belief about what you will pay right now, formed by a model you never meet.
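The tightening loop is easy to simulate. In this sketch, every click nudges a brand's score up, and the shelf is re-sorted by score; the scores and the boost are arbitrary, but the dynamic is the one described: clicking moves an item forward, and being forward earns more clicks.

```python
# All three brands start equal (scores invented for illustration).
scores = {"brand_a": 1.0, "brand_b": 1.0, "brand_c": 1.0}

def show_and_click(clicked: str, boost: float = 0.2) -> list[str]:
    """Record a click, then return the shelf re-ranked by score."""
    scores[clicked] += boost
    return sorted(scores, key=scores.get, reverse=True)

for _ in range(3):
    ranking = show_and_click("brand_a")  # the familiar brand, again

print(ranking)  # ['brand_a', 'brand_b', 'brand_c']
```

After three clicks the familiar brand owns the top slot, and nothing in the loop ever pushes back toward the unfamiliar.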
Then there is the screen you relax in front of. What you click, how long you watch, where you pause, what you abandon, even the time of day you do it: all of it becomes training data. The feed learns your tempo and trims the menu around you; "watch next" is not a guess, it’s a running calculation. Over time, your habits join everyone else’s to shape more than your homepage. They influence which trailers get cut, which thumbnails get tested, and which kinds of shows get commissioned in the first place. Personalisation doesn’t just pick tonight’s episode; it helps decide what gets made tomorrow.
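That "running calculation" might look something like the following sketch, where a few viewing signals are blended into a score per title. The feature names and weights are entirely made up; they stand in for the much richer signals a real recommender uses.

```python
def watch_next_score(completion_rate: float, recency_days: int,
                     similar_users_liked: float) -> float:
    """Toy blend of viewing signals into one ranking score."""
    recency = 1 / (1 + recency_days)  # newer habits count for more
    return 0.5 * completion_rate + 0.2 * recency + 0.3 * similar_users_liked

# Invented candidates: you binged the thriller yesterday, abandoned
# the documentary a month ago.
candidates = {
    "thriller_s2": watch_next_score(0.95, 1, 0.8),
    "cooking_doc": watch_next_score(0.40, 30, 0.6),
}
print(max(candidates, key=candidates.get))  # thriller_s2
```

The same scores, aggregated across millions of viewers, are the signal that tells a commissioner which kinds of shows are worth making next.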
Decisions once made by named people now emerge from objectives set upstream: maximise click-through, minimise default risk, cut queue time. Those targets sound neutral, but they bring values with them about whose time counts, which errors are tolerable and what kinds of lives fit the template. Over weeks and months, these proxies harden into norms. A default becomes a rule, and a rule starts to look like reality. Across these scenes the pattern holds. Software sorts, ranks, prices, flags and routes, mostly out of sight. The benefits are obvious: faster service, fewer fraud losses, smoother interfaces and cheaper operations. The consequences are quieter: narrow funnels, silent exclusions and attention steered toward outcomes that suit the model’s goals more than your own.
None of this requires a villain. It is optimisation. But optimisation hides choices. Someone decides what to measure, what to maximise and what to ignore. When those choices move from visible policies to hidden models, the power to shape everyday life moves with them.
AI did not arrive with flashing lights. It seeped in through defaults and settings and stayed because it made things easier. That is why it matters. Not because it is spectacular, but because it is routine. The place where its influence will be felt most is the place we look least closely: the everyday.
The views expressed here are those of the author and do not represent or reflect the views of RTÉ