Here Be Dragons: How to slay on the AI frontier
Last week I had the opportunity to speak at the Women in Tech conference in Boston with Erika Dale.
In the opening remarks, Carrie Beckstrom shared a little-known fact: “Here be dragons” is how cartographers marked the space beyond charted territory on maps. Since dragons represent what we fear, it’s an apt analogy for the frontier that technology, and AI specifically, creates. There is fear around the unknown, and we’re only just familiarizing ourselves with AI’s impact on our everyday work. We have to slay dragons to realize the full potential of AI for our organizations.
Here are some learnings and takeaways from the sessions that I think are most important to consider as AI challenges us to slay new dragons.
Takeaway: Much like the benefits of technology in general, companies highlight increased accuracy and efficiency as AI’s key benefits.
Reflection: While this is true, if your AI story stays at this level, your tech brand will be missing an opportunity for clarity and distinction in defining your place on the frontier.
The dragon to slay here is genericism.
Takeaway: The output of generative AI is only as good as the data it’s trained on. Today’s data carries bias, and because the field of data science is not yet diverse, that bias will persist without intervention.
Reflection: It’s our responsibility as people to correct the biases we’ve created and to democratize data.
The dragon to slay is our own narrow-mindedness.
Takeaway: Generative AI can give us a head start on our work, removing some of the more tedious tasks, like drafting papers and content.
Reflection: If we don’t keep applying a significant level of human brainpower, we’ll miss the small ways humanity touches us throughout our days.
The dragon to slay is shortsightedness.
To truly participate in pushing the frontier of technology with the power of generative AI, we need to proactively bring our expansive, human-centered mindsets and experiences to the effort, and slay the dragons that would keep us contained in limiting patterns.