When you walk into a museum, you enter a curated environment. The physical space itself provides context. Docents can guide discussions. Warning signs can prepare visitors for challenging content. But in a digital space, that careful curation and context can disappear with a single scroll or click.
This realization shaped our first framework principle: human moderation for sensitive content. While artificial intelligence can do amazing things, we believe that decisions about nudity, sexuality, and violence in art require human judgment, cultural understanding, and careful consideration of context.
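To make that principle concrete, here is a minimal sketch of how such a pipeline might route works, assuming a hypothetical classifier that flags sensitive themes and a hypothetical human review queue. The names and shapes are illustrative, not a description of our actual system: the point is simply that automated flags trigger human review, never automated decisions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Artwork:
    title: str
    source: str                       # e.g. "museum" or "community"
    flags: List[str] = field(default_factory=list)

def route_submission(artwork, classifier, review_queue, publish):
    """Automated signals inform human moderators; they never replace them."""
    flags = classifier.detect_sensitive_themes(artwork)  # hypothetical classifier API
    if flags:
        artwork.flags = list(flags)
        review_queue.enqueue(artwork)  # a person weighs context, culture, and intent
    else:
        publish(artwork)               # nothing flagged: publish directly
```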
Our second framework principle focuses on content sources. We maintain a 70/30 split between museum works and public contributions. This isn’t just about quality control – it’s about creating a space where institutional expertise helps frame broader artistic dialogue.
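As a rough illustration of how that split could be enforced when assembling a page of works, here is a minimal sketch; the function name and data shapes are assumptions for the example, not our production code.

```python
import random

def compose_feed(museum_works, community_works, page_size=10, museum_ratio=0.7):
    """Fill a page with roughly 70% museum works and 30% community contributions."""
    museum_count = round(page_size * museum_ratio)
    community_count = page_size - museum_count
    page = random.sample(museum_works, min(museum_count, len(museum_works)))
    page += random.sample(community_works, min(community_count, len(community_works)))
    random.shuffle(page)  # interleave the two sources within the page
    return page
```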
But even museum-sourced content requires careful consideration in digital spaces. Take Edvard Munch’s “Puberty,” a renowned work displayed at the National Museum of Norway. When we included this piece on our platform, we discovered something surprising: even with proper context and historical background, some users found the digital presentation deeply troubling. This taught us that digital presentation can fundamentally change how art is perceived and experienced.
The Algorithm Question
Perhaps our most crucial decision was choosing not to implement recommendation algorithms – yet. This might seem counterintuitive in today’s tech landscape, where engagement-driven algorithms dominate. But our research, funded by Stanford University, revealed that users need guidance when navigating sensitive subject matter.
The New York Times recently exposed how Instagram’s engagement-driven recommendation systems can create dangerous dynamics, inadvertently encouraging problematic behaviors and creating echo chambers of inappropriate content. When algorithms optimize for engagement alone, they can amplify concerning patterns: users who linger on certain types of content receive more of it, potentially developing unhealthy viewing habits or even pathological behaviors.
This research has profound implications for art platforms. Consider:
- A user interested in classical nudes might be served increasingly suggestive artworks
- Someone exploring dark themes in art could be pushed toward more disturbing content
- Viewers might get trapped in narrow artistic perspectives rather than experiencing diverse works
This isn’t just about avoiding harm – it’s about actively building something better. As we develop our next app update, we’re focused not on maximizing daily active users or time spent, but on understanding what makes a great art patron and how we can nurture those qualities through technology.
We believe algorithms should serve art appreciation, not the other way around. When we do implement recommendations, they will be designed to:
- Encourage healthy exploration of art
- Promote diverse artistic perspectives
- Prevent algorithmic reinforcement of problematic viewing patterns
- Prioritize educational value over addictive engagement
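As a sketch of what a recommender built around those goals could look like – not our actual implementation – the re-ranker below assumes each candidate work carries an educational_value score and a theme label, and that recent_themes counts what the user has just been viewing. Instead of amplifying whatever a user lingers on, it boosts educational value and caps repetition of any single theme.

```python
from collections import Counter

def rerank(candidates, recent_themes, max_per_theme=2):
    """Order works by educational value, capping repetition of any one theme."""
    scored = sorted(
        candidates,
        key=lambda w: w["educational_value"] - 0.5 * recent_themes.get(w["theme"], 0),
        reverse=True,
    )
    picked, per_theme = [], Counter()
    for work in scored:
        if per_theme[work["theme"]] < max_per_theme:  # preserve breadth of themes
            picked.append(work)
            per_theme[work["theme"]] += 1
    return picked
```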
Building Something Different
Unlike platforms that optimize for engagement at any cost, we’re asking more fundamental questions: What makes a thoughtful art patron? How can technology foster meaningful engagement rather than compulsive browsing?
The answer is emerging through our partnership with museums. When museums join our platform, they’re not just gaining another digital tool – they’re joining a mission to reimagine what art engagement looks like in the 21st century.
Consider today’s common scene: visitors rushing through galleries, pausing briefly for selfies, treating the museum as just another box to check for their social media feeds. But what if we could cultivate a different kind of patron? Imagine visitors who arrive already informed and ready to experience the museum for all the knowledge and inspiration it can provide.
This is why we’ve chosen not to implement engagement-driven algorithms. Recent investigations have shown how recommendation systems can create problematic patterns, serving users increasingly concerning content based on their viewing habits.
Museums joining our platform aren’t just adding another digital channel – they’re participating in a fundamental shift in how we think about digital engagement with art. Together, we’re working to cultivate not just visitors, but true patrons who will help sustain and strengthen museums for generations to come.
This approach requires patience. It means sometimes choosing slower growth over viral moments. It means measuring success not by likes or screen time, but by the depth of engagement and the quality of discussions. But we believe this is how we build something truly valuable – and truly safe – for art institutions and their communities.
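For instance, a depth-of-engagement measure might look something like the sketch below, assuming hypothetical session records with per-work dwell times and discussion activity; the field names and threshold are illustrative only, not metrics we have committed to.

```python
def engagement_depth(sessions, min_dwell_seconds=60):
    """Share of viewed works that earned sustained attention or a discussion post."""
    deep = total = 0
    for session in sessions:
        for view in session["views"]:
            total += 1
            if view["dwell_seconds"] >= min_dwell_seconds or view.get("posted_comment"):
                deep += 1
    return deep / total if total else 0.0
```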