I'm a UX researcher and sociologist interested in how platforms shape—and are shaped by—cultural participation. My work focuses on how people make meaning on platforms like TikTok and Instagram, especially through memes, subcultural aesthetics, and everyday creative practices.
I approach platforms as sociotechnical systems, drawing on Science and Technology Studies (STS) to understand how interface design, governance, and algorithmic systems structure what users can see, do, and become. My research spans both applied and academic contexts: I conduct UX research professionally, and continue to publish scholarly work on digital culture, platform governance, and creative labor.
I take a mixed-methods approach, using discourse analysis, digital ethnography, policy analysis, and platform walkthroughs, alongside computational methods in R and Python for both statistical modeling and lexical analysis. I'm especially interested in how qualitative insight and quantitative structure can complement each other in studying digital culture.
Also, lots of fun stuff: how humor travels, how aesthetic norms evolve, and how users stretch the limits of platforms in playful, surprising, and sometimes subversive ways.
(Last updated 1/27/2026)
All of the nitty-gritty is documented on my LinkedIn page, but in brief: prior to my academic career I worked in K-12 education and operations. I have extensive experience with community-engaged work, large-scale operations, financial and facilities management, project management, and event planning and execution.
That framing assumes platforms are neutral tools that are either harmful or safe. I study something different: how platforms structure childhood and youth participation through design, governance, and the gaps communities fill. My research in the International Journal of Cultural Studies shows how TikTok governs young users through scaffolding, segmenting, and siloing — layering restrictions, categorizing content for different moderation treatment, and restricting features by age. But formal governance often falls short, which is why my research with Crystal Abidin in Policy & Internet examines "patchwork governance" — how communities develop protective practices around internet-famous children when platform policies fail. The question isn't whether platforms are bad — it's how they structure what's possible and who steps in when official systems don't.
"Addiction" frames this as individual pathology, but what's actually interesting is how platforms structure participation differently. TikTok's recommendation system prioritizes algorithmic discovery — content can reach millions without an established audience. Instagram rewards audience maintenance and aesthetic consistency within existing networks. I call this structured participation: the way platforms engineer the conditions of creativity through what's easy, visible, rewarding, and safe to post. These aren't just technical differences; they shape what content gets made, what creators can succeed, and what "going viral" means. I study what platforms actually reward with reach, which often diverges from what they say they value.
I wrote about this for Tech Policy Press with Rebecca Scharlach. Platform restrictions often create the conditions they aim to prevent — pushing users toward less regulated alternatives while leaving underlying dynamics of algorithmic amplification and data collection unaddressed. My research focuses on how platforms govern through both constraint (what they remove) and reward (what they amplify). That framework applies regardless of which company operates the platform or which country regulates it. The ban debate often treats platforms as black boxes when the more useful question is how their systems actually work.
Platforms communicate values publicly — safety, authenticity, quality — but their systems often optimize for something else. I distinguish four layers: stated values (explicit claims), latent values (implicit priorities in how policies are structured), enacted values (what moderation actually enforces), and revealed values (what algorithms reward with reach). My research in the International Journal of Cultural Studies analyzed TikTok's community guidelines to surface latent values like positivity, proactivity, and precision — priorities that aren't explicitly stated but shape governance in practice. When these layers diverge — when algorithms amplify content that violates stated policies — the gap becomes visible, and consequential.
Because platforms don't just host content — they structure participation differently. My research in Platforms & Society shows how TikTok and Instagram construct authenticity, creativity, and discovery in distinct ways, producing different conditions for cultural production. Users navigate these conditions through what I call memetic negotiation — developing platform-specific vernaculars that balance individual expression with collective trends, authenticity with strategy, innovation with convention. When memes cross platforms, they undergo memetic translation: structural adjustment to new formats, semiotic recalibration to new audience expectations, and cultural integration into the receiving platform's norms. Form, meaning, and who can succeed all shift.
Depends what you mean by "work." Platforms have improved at removing prohibited content — the enacted values layer. But removal is only half of governance; the other half is amplification. My work on revealed values shows that recommendation systems can massively amplify content violating every stated policy — billions of views on content platforms claim to prohibit. I'm analyzing EU Digital Services Act transparency data to examine what platforms actually enforce versus what they claim to prioritize. Accountability focused only on removal misses where value is allocated: in what gets reach.
Yes — and that's underexplored in platform research. My article in Social Media + Society examines "Look At This F*ckin' Street," an Instagram account that transformed pothole complaints into infrastructure activism in New Orleans. Using multimodal discourse analysis, I show how the account combined visual documentation, humor, and platform affordances to create accountability pressure that actually got streets fixed. Platforms aren't just spaces where culture happens or harm occurs — they enable new forms of collective action and civic engagement. Understanding how requires the same attention to platform structure I bring to studying content moderation or meme circulation.
Mixed methods: discourse analysis of platform policies, systematic walkthroughs of interfaces and features, multimodal analysis of content, and quantitative analysis of platform-scale data like DSA transparency reports. A core principle is triangulation — testing interpretive claims against measurement and refining concepts when they don't hold. I've published quantitative work in JMIR, and my current projects analyze moderation patterns across platforms. I serve as Key Regional Leader for North America in the TikTok Cultures Research Network, building shared methodological resources for studying short-form video. My research has been cited in WIRED, Washington Post, Reuters, The Verge, and other outlets.