From the creator of MythoMax, this model merges a suite of models to reduce the usage of overrepresented words such as "anticipation", "ministrations", and other undesirable words common in ChatGPT roleplaying data.

It combines Neural Chat 7B, Airoboros 7B, Toppy M 7B, Zephyr 7B Beta, Nous Capybara 34B, OpenHermes 2.5, and many others.

#merge

Model Information

Model ID: gryphe/mythomist-7b
Context Length: 32,768 tokens
Author: gryphe
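
Since the listing provides a model ID and context length, below is a minimal sketch of how this model might be called through an OpenAI-compatible chat completions API. The endpoint URL, API key environment variable, and client library are assumptions for illustration and are not part of this listing.

```python
# Minimal sketch: calling gryphe/mythomist-7b via an OpenAI-compatible
# chat completions endpoint. The base_url and OPENROUTER_API_KEY variable
# are assumptions, not part of this listing.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed credential variable
)

response = client.chat.completions.create(
    model="gryphe/mythomist-7b",  # Model ID from this listing
    max_tokens=512,               # well within the 32,768-token context
    messages=[
        {"role": "system", "content": "You are a creative roleplaying assistant."},
        {"role": "user", "content": "Introduce your character in two sentences."},
    ],
)

print(response.choices[0].message.content)
```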

Capabilities