Guest Speaker: How It Works at Coalesce New York

Thank you, Naomi, for the warm welcome, and for the opportunity to share some of my recent research. By day I work in the NBC News Group newsrooms, where I oversee the core digital product experiences for NBC News, MSNBC, CNBC, The Today Show, E! Entertainment and Telemundo.

As you can imagine, much of our focus at the moment is on preparing for next year’s election, the cycle of which has of course already started. Misinformation is already everywhere; that’s deeply concerning, but nothing new. I suspect the next 18 months will see generative artificial intelligence enormously accelerate that misinformation. We’re already seeing images created in Midjourney appearing in campaign commercials, and I think this is going to be the first American election characterized by AI.

Ethical issues are everywhere, and today I’m going to talk a little bit about a few of them. A large portion of misinformation runs on the oil of data harvesting: the information we willingly, and sometimes unwillingly, offer about ourselves to those who would seek to benefit from it. We accept obfuscated terms and conditions. We allow passive location tracking. And we share like never before.

I have a few prepared remarks I’d like to offer, based on a recent research project I conducted at the University of Pennsylvania, where I’m an undergraduate student majoring in Ancient Religious Cultures and Globalization. It may seem a little strange for a religions student to be talking about AI, but I’m hopeful that by the end you’ll see how strongly these threads weave together, and then we can open it up to questions.

Over the past year, many of us have engaged in an artificial intelligence arms race, integrating generative tools into products in commercial, brand-driven efforts to appear innovative. Organizations choosing not to participate, or proceeding with caution, risk the perception of being left behind as generative tools vacuum up audience and attention. Prompting has entered our vocabulary, and it’s getting harder for any of us to determine what’s real. But what is real for many is the fear of the future. Of what these generative tools are going to do to us as citizens. Of technology that surpasses our ability to understand it. Many have been vocal about job loss and human disintermediation. Some are calling for a period of developmental pause. Legislation is already years behind. And our stewardship of the present poses very real risks for the future.

Artificial intelligence is reshaping social norms, issues of identity and citizenship, privacy, and the boundaries we establish with others. As it reshapes who we are as individuals, and who we are to each other, it is even reframing our understanding of the space between life and death. The emerging field of grieftech, which focuses on our capacity to digitize the storytelling of human experience, reducing it to a series of prompted responses stored on a remote server for later recall, raises deep ethical questions of faith, prolonged bereavement, and the masking of pain as a fundamental part of human experience. It surfaces deeply problematic, culturally nuanced questions about the difference between can and should, especially when the implementation of these technologies carries the often unintended consequence of reinforcing bias, discrimination, exclusion and hierarchy.

Now, before I get into what grieftech really is and how it works, I want to tell you a story. A few years ago, James Vlahos’ father was dying. James had always been close to his father, and the terminal diagnosis was devastating. Like many of us, James didn’t want to see someone he loved so deeply pass on. James’ father had always been a storyteller and a joker. For any given situation, his father seemed to have the perfect anecdote through which to pass on his life’s wisdom to his kids. He loved football and family. He loved to cook, and he’d traveled the world in both his personal and professional life. A life that seemed as if it was going to be cut all too short. So James decided to start documenting his father’s stories, recording hundreds of hours of memories: how he met his wife, his humble beginnings, school life, work life, home life. Anything and everything James could get his father to remember. Over time he amassed an enormous library of his father’s memories, all set down in his own words.

When James’ father passed away, James was devastated. He’d known it was coming for a long time, but none of that made the moment any easier. Instead of looking at old pictures, he’d listen to his father’s stories as a way of keeping him alive and remembering a life well lived. Over time, and with the advent of large language models, James was able to train an artificial intelligence model on all of his father’s stories to create what he called the Dadbot: a generative voicebot to which James could ask simple questions, and a generative version of his father’s voice would dispense his advice, often with pithy anecdotes and always with a wry sense of humor. James could feel as if he was talking to his father from beyond the grave, keeping him alive. His grief was dulled, and his father’s voice was only a prompt away.

Over time, James scaled the Dadbot into the generative company Hereafter.ai, which offers the same service to its users for a monthly fee. Stories are harvested over time through audio, written pieces and video, and can be recalled with prompts and questions later on. Hereafter describes its service as ‘remembrance reinvented’, a modern way of never letting go, and positions it as the perfect gift for loved ones during the holidays. Hereafter has many imitators, especially as the generative arms race accelerates, and it positions itself as a natural extrapolation of the economy of sharing we’ve all grown accustomed to with social media.
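
For the technologists in the room, it may help to make that mechanic concrete. Hereafter doesn’t publish its implementation, so what follows is only a minimal, hypothetical sketch of the recall pattern these products share: transcribed memories sit in a corpus, and a question retrieves the most relevant memory. The sample stories, the `recall` function, and the bag-of-words matching are all illustrative stand-ins; a real product would layer a generative model and voice synthesis on top of this retrieval step.

```python
import math
import re
from collections import Counter

def tokenize(text: str) -> Counter:
    """Lowercase bag-of-words representation of a passage."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical recorded stories, standing in for the hundreds of
# hours of transcribed memories described in the talk.
stories = [
    "I met your mother at a dance hall in 1962. I stepped on her toes twice.",
    "My first job was sweeping floors at the mill. Hard work never hurt anybody.",
    "Every Sunday we cooked together. The secret to the sauce is patience.",
]

def recall(question: str) -> str:
    """Return the recorded story most relevant to the question asked."""
    q = tokenize(question)
    return max(stories, key=lambda s: cosine_similarity(q, tokenize(s)))

print(recall("How did you meet your wife?"))
```

The point of the sketch is the ethical shape of the system, not its sophistication: once stories are reduced to a queryable corpus, anyone holding the corpus can keep answering questions in the speaker’s voice indefinitely.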


So let’s talk a bit about grief itself. Grief is an inescapable part of loss. Grieftech describes an emergent class of digital products which seek to preserve a person’s essence after death through the extraction of their stories while they are alive. These stories can then be interactively recalled, as James did, using artificial intelligence to power chatbot experiences in which the living feel as if they are interacting with the dead. Products such as Hereafter.ai, Eternime and the William Shatner-endorsed Storyfile are all good examples of what I’m talking about here.

As I mentioned, grieftech positions itself as remembrance reinvented, but there are deep ethical considerations here: the currently unknown psychological impacts of synthetically prolonged sorrow, changes in the ways we relate to the dead, legacy data privacy, and the commercial rights to likeness left to survivors. Many Hollywood actors are already taking legal steps to preserve their likeness after death. And as custodians of remembrance beyond what’s stored on a product’s servers, it is essential that we build the deeply human dimensions of responsibility, care and accountability into the decision-making processes that go into not just these products’ development, but also their distribution. I argue that grieftech developers bear the same ethical responsibilities to their future users as they do to their current content creators.

Let’s break the term apart a little. As I mentioned before, grief is a critical part of authentic human experience. But learning to live with digital facsimiles that mask sorrow arrests the opportunity to go deeper into our relationship with what we believe happens next. It arrests a pain necessary to embracing the reality of a lost loved one, and this rich relational aspect of human experience cannot, maybe should not, be reduced to a set of recalled responses. Pain isn’t good in itself. But it’s a necessary part of human experience.

In my research I spent time talking with many men of faith about this. Priests who experience the pain of others every day, and for whom pain is foundational to their faith. What I heard over and over from them is that ultimately pain comes from love. Perhaps it’s a love lost, but it’s love. You can feel that in the story I told you about James. But if we try to fill our time with something that’s really just a coping mechanism, we lose the opportunity to move on. Humans are deeply communal beings. We crave interaction with others, and the friendships we carry through life are more than an exchange of information. Too often in digital spaces of innovation we have grown accustomed to numbing our pain with scrolling serotonin rather than embracing it as part of what it means to be alive. These are all-too-human problems which should not be masked by synthetic digital solutions, however immediately intoxicating they may appear to be.

But of course, I’m not advocating for a world where technological innovation does not seek to help humans through deeply emotional experiences, nor for a return to the dark ages. I believe technology has much to offer in grief counseling and in our relationship with those no longer with us. But I am advocating for increased responsibility toward the necessary human emotional needs which such products mask or prolong. We seek kindness and care, not faster, optimized empathy. The pain of loss comes from embracing a slower feeling of love. Humans are not reducible to exchanges of information or the digitization of their stories. And facsimiles of loved ones bear a deep responsibility, as in the medical field, to do no harm.

These ethical issues, surfaced by fear of the future, are nothing new, especially concerning dystopian ideas of artificial intelligence accelerating beyond our control. A discomfort with the very idea of digital memory and preservation beyond death taps into the very real human problems we already have with our own mortality. Resurfacing memory, feelings of nostalgia, and the ability to digitally recall the events of our lives and those we love are powerful motivators of engagement and attention. The means by which we create meaning from our individual experiences of the world surface questions of what’s possible, what’s culturally defined as ethical, and ultimately what’s legal. Individuals may lack the language to articulate this concern, but we, as both users and creators, still possess the agency and responsibility to be curious about the consequences of what we exchange. All of these collide inside grieftech experiences which, at least for now, are often constructed from a distinctly Western, individualistic and affluent perspective.

In the United States, existing legislation determines that an individual’s privacy has financial value and is ultimately negotiable. So while privacy is for sale, it also comes bundled with large ethical issues of ownership. Who owns an individual’s data, likeness, engagement history or network? Does the product bequeath the eternal use of another’s likeness to their descendants, or to others? Legal protections already do little to prevent abuse of populations we agree should be protected, especially when those populations can’t represent themselves. We often willingly say yes to digital identity harvesting because we value the exchange, at least in the terms that we understand it, even when we don’t have the language to express our consent. But even when someone has given up their privacy in advance, and that consent is present, within grieftech the person who has died is not the end user. Proposed legislation recommends that an individual has the right to access services allowing them to create a trusted identity to control the safe, specific, and finite exchange of their data. But these proposals omit considerations of posthumous use, and of who consents to it: surviving loved ones, or third-party providers.

So what are those creating these experiences obligated to do? The synthetic feeling of keeping someone alive inside a period of grief may have the noble aspiration of easing the pain of loss, but it also carries the unintended ethical risk of prolonging grief itself. Grieftech runs on the personal disclosure of intimate life stories and thoughts, preserved for survivors, and is extremely empathic in substance and intention. Grieftech aspires to endure. Therein lies the obligation and responsibility of custodians of remembrance, well beyond the recurring monthly subscription payment. The legacy of the deceased and the voices of those who survive must be prioritized in matters of ethical conflict and permissions negotiation. The voice of the end user, in this case an end user in perpetuity, is the voice that must be protected. This must be built into these experiences as a matter of critical importance.

With any digital product explicitly intended for long-term use, the future is often unclear, and the propensity toward unintended consequence is high. Consequences may include the psychological impact of synthetically prolonged sorrow, changes in the ways in which we relate to the dead, legacy data privacy, the rights to likeness left to survivors, and the right to commercialize the stories we leave behind for others. Responsible AI must be deeply ingrained not only in these products themselves, but also in the economic mechanics which continue to fund them. Sustainable processes, much like the existing frameworks for accessibility or translation that investors will already be familiar with, need to be wrapped around the development process itself. Ethically responsible governance in the distribution and responsiveness of artificial intelligence services is deeply shaped by the all-too-human problems of unintended consequence, fear of the future, bias, and flawed, culturally nuanced decision-making.

This is where many of the challenges inherent in grieftech collide. It is a legacy-driven, long-term product which is only a few years old and serves emotion in the present. It is not intended to be used in the short term and then discarded. Preserved in digital amber for future generations, it is expressly intended to endure. From a recurring-revenue, and potentially cynical, economic perspective, locking customers into a platform for the long term makes a lot of commercial sense. But through a lens of responsibility, the platforms also have to be held accountable to endure, and to ensure that emergent existential issues such as model collapse are held at a distance.

If a family’s legacy is increasingly stored on remote servers, the platforms themselves have a responsibility to preserve and maintain the legacy of those who have chosen to upload their digital lives into the cloud for future retrieval. Many terms-of-service documents speak to this: if the company were ever to go out of business, users would be offered the ability to download what had been harvested. This is standard practice in many parts of the web, but I’d argue it doesn’t go far enough in this context. Grieftech platforms have a future-facing responsibility to their customers to endure in ways that social platforms do not; they are explicitly positioned as platforms of remembrance. They trade in the preservation of long-term memory. As such, they need to reflect that in their operational health. In this sense a grieftech platform is like an archive, a library. But not in the sense of temporarily borrowing books and returning them. In the sense of preserving the past for the long term, closer to the reverence we might extend to a museum.

So what happens next? Armed with these ethical insights, grieftech platforms have a deep responsibility to endure. As digital custodians of human lives, they must ensure that emergent existential issues such as large language model collapse are held at a distance. Use this research to guide your decision-making and to shape your roadmaps. These are products intended for the future, not the present, and they bear a deeply human responsibility to operate in ways which sustain them into that future. As investors, you control the means by which many of these experiences get developed. In your hands you hold both great power and great responsibility. Let’s exercise it.

