Library

Collected works on compression, memory systems, and semantic optimization by Hypernym.

Table of Contents

All entries: original works, annotations, and references curated for long-term retention.

Hypernym Joins Llama Startup Program

June 16, 2025

Hypernym is now part of Meta's Llama Startup Program, supporting early-stage innovation in the LLM ecosystem. This partnership opens new avenues for research collaboration and resource sharing within the broader AI community.

01
News
Marginalia
First major partnership
Opens research channels

Library Launch + Entry Filters Added

June 15, 2025
This page now includes tagged entries, third-party research, and filterable categories for easier browsing. The manuscript-style interface reflects our commitment to treating knowledge as a living document.

02
Updates
Marginalia
UI overhaul
Book metaphor implemented

LLM Fingerprinting

June 2, 2025
Luiza Corpaci, Chris Forrester, Siddhesh Pawar
Identifying and verifying LLM outputs via persistent token-level markers. A submission to the Apart Research Sprint exploring novel approaches to AI output authentication and traceability.

03
Research
Marginalia
Sprint submission
Novel authentication method

From Tokens to Thoughts: How LLMs and Humans Trade Compression for Meaning

May 26, 2025
Chen Shani, Dan Jurafsky, Yann LeCun, Ravid Shwartz-Ziv
A Meta–Stanford collaboration comparing semantic compression strategies between humans and LLMs. This work provides crucial insights into the fundamental differences in how biological and artificial systems process and compress information.

04
Relevant
Marginalia
Foundational work
Human-AI comparison

Hypernym Mercury: Token Optimization Through Semantic Field Constriction

May 14, 2025
Chris Forrester, Octavia Sulea
A novel (patent-pending) method for semantic compression with controllable granularity and 90%+ token reduction. Benchmarked on Dracula, this work establishes the theoretical foundation for our compression approach.

05
Research
Marginalia
Core methodology
Patent pending
Dracula benchmark