Supernatural Language Processing

@supernlpblog

We blog about NLP, machine learning, and other stranger things.

Supernatural Language Processing reposted

Wild that "you don't need word order to understand English" is a common enough position in NLP that people need to publish papers showing that, contrary to common wisdom, "the dog ate the cat" does not mean the same thing as "the cat ate the dog"
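A toy illustration of the point (mine, not from the tweet): a bag-of-words representation discards word order entirely, so the two sentences become literally indistinguishable to any model built on top of it.

# Order-insensitive "understanding": both sentences map to the same token counts.
from collections import Counter

def bag_of_words(sentence):
    # Represent a sentence purely by how often each word occurs.
    return Counter(sentence.lower().split())

print(bag_of_words("the dog ate the cat") == bag_of_words("the cat ate the dog"))  # True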


Supernatural Language Processing reposted

Typological diversity is necessary but not sufficient for inclusive #NLProc. In an ACL 2022 theme track paper titled "Challenges and Strategies in Cross-Cultural NLP", we propose a multidimensional framework for thinking about cultural bias and adaptation arxiv.org/abs/2203.10020

A (non-comprehensive) survey of research linking Neural Response measurements & Computational Models of language: arxiv.org/abs/2203.05300. Soon to become a blog post!


Supernatural Language Processing reposted

Connecting Neural Response measurements & Computational Models of language: a non-comprehensive guide arxiv.org/abs/2203.05300
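A minimal sketch of the encoding-model recipe this line of work builds on (all data below is synthetic; in practice the features come from a language model and the responses from fMRI/MEG recordings):

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_stimuli, emb_dim, n_voxels = 200, 64, 10
features = rng.standard_normal((n_stimuli, emb_dim))              # stand-in LM features per stimulus
true_map = rng.standard_normal((emb_dim, n_voxels))
responses = features @ true_map + 0.1 * rng.standard_normal((n_stimuli, n_voxels))  # stand-in neural data

X_tr, X_te, y_tr, y_te = train_test_split(features, responses, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)                          # linear encoding model
print("Held-out R^2:", model.score(X_te, y_te))                   # prediction quality on unseen stimuli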


Supernatural Language Processing reposted

We are excited to host Mostafa Abdou this week in our NLPhD speaker series. Mostafa will give a talk on: "Can LMs Encode Perceptual Structure Without Grounding? A Case Study in Color" 🔵🔴🟢 When? 📅 Tuesday (Feb. 8th, 14:30 CET) Where? 🌐 virtually on MS Teams Join us! 🔻

Supernatural Language Processing reposted

My group at MPI-SWS is looking for postdocs, PhD students, and research interns interested in neuroscience + ML/NLP. Starting dates are flexible & full funding and benefits are provided. Shoot me an email if interested! Please RT to spread the word! 🙏


Supernatural Language Processing reposted

In our small reading group, we (@badr_nlp, @gossminn and Aria) enjoyed reading the paper "Can Language Models Encode Perceptual Structure Without Grounding? A Case Study in Color". An interesting paper by @supernlpblog 👏🏽👏🏽


Supernatural Language Processing reposted

Can language models encode the topology of colours from text only? yes! #conll2021 #EMNLP2021livetweet arxiv.org/pdf/2109.06129… Mostafa Abdou, Artur Kulmizev, @daniel_hers @stellaBotte @Brown_NLP Anders Søgaard
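Roughly how such a probe works, as a self-contained sketch (synthetic stand-ins throughout; the paper itself maps real colour-term embeddings to CIELAB coordinates and also uses representational similarity analysis):

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_colors, emb_dim = 60, 48
color_space = rng.uniform(0, 100, size=(n_colors, 3))             # stand-in perceptual colour coordinates
mixing = rng.standard_normal((3, emb_dim))
embeddings = color_space @ mixing + rng.standard_normal((n_colors, emb_dim))  # stand-in LM embeddings

X_tr, X_te, y_tr, y_te = train_test_split(embeddings, color_space, random_state=0)
probe = LinearRegression().fit(X_tr, y_tr)                        # linear map: embeddings -> colour space
print("Held-out R^2:", probe.score(X_te, y_te))                   # how much colour topology the text encodes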

Supernatural Language Processing reposted

... is looking for new friends (for PhDs in #nlproc) candidate.hr-manager.net/ApplicationIni… See coastalcph.github.io for current members. #emnlp2021


Supernatural Language Processing reposted

Excited to share a new preprint from Artur and Joakim: Schrödinger's Tree -- On Syntax and Neural Language Models arxiv.org/abs/2110.08887. Let us know what you think.


Supernatural Language Processing reposted

More Postdoc/PhD positions available in #NLProc / #machinelearning. A great chance to work with these people: coastalcph.github.io Reach out ([email protected]) if you're interested.


Supernatural Language Processing reposted

Puzzled over how pretraining "transfers knowledge" from music (or genomics) to text? We challenge the "transfer learning" story altogether, showing that for summarization, we can get pretraining's benefits absent any actual upstream data. arxiv.org/abs/2109.04953

"Does Pretraining for Summarization Require Knowledge Transfer?" Short answer: most of the gains of T5 over a random init baseline can be realized absent any knowledge (by pretraining on procedurally generated babble)! Long answer: see thread... arxiv.org/abs/2109.04953



Supernatural Language Processing reposted

Exciting postdoc opportunities in 'NLP and cognitive neuroscience' and 'NLP and fairness'. Amazing collaborators (coastalcph.github.io) and a wonderful city (visitcopenhagen.com/node/1345). Reach out to [email protected], if you're interested.


Supernatural Language Processing reposted

Applications for ALPS 2022 are still open for a few days. This is an advanced training school in NLP with talks AND interactions with @kchonyc, @YejinChoinka, Mona Diab, @IGurevych, @PfeiffJo, @gneubig, @colinraffel, @zehavoc alps.imag.fr


Supernatural Language Processing reposted

#NLPaperAlert 📢: ‘Just What do You Think You’re Doing, Dave?’ A Checklist for Responsible Data Use in NLP (with @eltimster, @Kobotic): - summary of the principles of data use & where they don't align - proposal for a #NLProc data use checklist, akin to the reproducibility one /1
