The Liber Primus remains unsolved to this day. It is a 58-page book written in runes whose bewildering encryption continues to haunt hacker gunslingers around the globe, who study and discuss its contents via IRC (Internet Relay Chat).

The cryptic book arrived on the internet in the mid-2010s, posted by the now wildly popular but mysterious internet group 3301. While the group's identity remains hidden, it is speculated they are a remnant of the cypherpunk activist movement (birthed somewhere out of Berkeley in the 80s). At least this is the most plausible explanation given to us by one of the few known hackers who's made it inside the clandestine group — Marcus Wanner. But who knows…

3301’s Cicada project started with a cryptic 4chan post in 2012, leading thrill seekers — complete with a cult-like following — on a puzzle hunt that encompassed everything from steganography to cryptography. While most of the puzzles were eventually solved, the very last one, the Liber Primus, is still (mostly) encrypted. The last known comms from 3301 came in April 2017 via a Pastebin post:

Message from 3301/Cicada - Pastebin.com

FYI, there’s a standard PGP (Pretty Good Privacy) key used to sign all 3301 posts. If you see a 3301 post online without their PGP signature, don’t trust it (there are plenty of troll accounts to be found).

For a Summary/Timeline:

Uncovering Cicada Wiki


uncovering-cicada.fandom.com

Visit Nox’s YouTube channel if you are interested in understanding how the pre-Liber Primus Cicada puzzles were cracked.

Meanwhile back at the ranch…

I luckily found my way to creating a training script for adapters (the modular add-ons discussed in last week’s blog). The script works on the GLUE datasets. Will keep everyone updated as new events unfold regarding the AdapterHub. Very excited about this new framework — once again, thanks to Jonas for nudging me in the right direction.
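For readers new to adapters: the core idea is a small bottleneck module inserted into each (frozen) transformer layer, so only the adapter weights are trained. The sketch below illustrates that bottleneck shape in plain Python — it is a conceptual toy, not the AdapterHub API, and the dimensions are made up:

```python
import math
import random

random.seed(0)

def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]

class Adapter:
    """Bottleneck adapter: down-project -> ReLU -> up-project + residual.
    Only these two small matrices would be trained; the frozen
    transformer layers around the adapter stay untouched."""
    def __init__(self, hidden_dim, bottleneck_dim):
        scale = 1.0 / math.sqrt(hidden_dim)
        self.down = [[random.uniform(-scale, scale) for _ in range(hidden_dim)]
                     for _ in range(bottleneck_dim)]
        self.up = [[random.uniform(-scale, scale) for _ in range(bottleneck_dim)]
                   for _ in range(hidden_dim)]

    def forward(self, h):
        # Down-project the hidden state and apply a ReLU nonlinearity.
        z = [max(0.0, v) for v in matvec(self.down, h)]
        # Up-project back and add the residual connection.
        return [h_i + u_i for h_i, u_i in zip(h, matvec(self.up, z))]

adapter = Adapter(hidden_dim=8, bottleneck_dim=2)
h = [0.5] * 8                  # a stand-in for a transformer hidden state
out = adapter.forward(h)
print(len(out))                # → 8, same dimensionality as the input
```

Because the bottleneck is tiny relative to the hidden size, adapters add only a small fraction of trainable parameters per task.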

Stay Frosty ✌✌


This Week

SimpleTOD

TurboTransformers

NLP & Audio Pretrained Models

NERtwork

AllenNLP Library Step-by-Step

Search Engining is Hard Bruh

Dataset of the Week: ODSQA


SimpleTOD

Task-oriented dialogue systems — especially those chatbots we all dream of one day building — have traditionally been built as a standard modular pipeline (similar to what you find in the RASA framework). However, Salesforce Research recently released a unidirectional language model called SimpleTOD that attempts to solve all the sub-tasks in an end-to-end manner. It was built with Transformers and trained on the MultiWOZ dataset.
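The trick is to serialize everything the pipeline used to handle separately — dialogue context, belief state, system actions, and the response — into one sequence for a causal LM to model left to right. A rough sketch of that serialization is below; the delimiter tokens approximate the paper's format, and the exact names in the released code may differ:

```python
def build_simpletod_sequence(turns, belief, actions, response):
    """Flatten one dialogue turn into a single training string for a
    causal language model. Delimiter tokens are approximations of
    SimpleTOD's format, not copied from the released code."""
    context = " ".join(turns)
    return (
        f"<|context|> {context} <|endofcontext|> "
        f"<|belief|> {belief} <|endofbelief|> "
        f"<|action|> {actions} <|endofaction|> "
        f"<|response|> {response} <|endofresponse|>"
    )

# Hypothetical MultiWOZ-style turn, for illustration only.
seq = build_simpletod_sequence(
    turns=["user: i need a cheap hotel in the north"],
    belief="hotel price cheap, hotel area north",
    actions="hotel inform choice, hotel request stars",
    response="there are 2 options. how many stars would you like?",
)
print(seq)
```

At inference time the model is prompted with the context segment and generates the belief state, actions, and response in one pass, which is what makes the approach "end-to-end."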

Blog:

SimpleTOD: A Simple Language Model for Task-Oriented Dialogue

We propose recasting task-oriented dialogue as a simple, causal (unidirectional) language modeling task. We show that…

blog.einstein.ai

Paper:

GitHub:

salesforce/simpletod

Authors: Ehsan Hosseini-Asl, Bryan McCann, Chien-Sheng Wu, Semih Yavuz, and Richard Socher Task-oriented dialogue (TOD)…

github.com

TurboTransformers

A recent transformer runtime library for inference, TurboTransformers, came to my attention. This library optimizes for what everyone wants in production: lower latency. They claim:

It brings 1.88x acceleration to the WeChat FAQ service, 2.11x acceleration to the public cloud sentiment analysis service, and 13.6x acceleration to the QQ recommendation system.

The selling point is that it supports variable-length input sequences without preprocessing, which reduces computational overhead. 🧐


NLP News Cypher | 07.26.20