|[Quick tour: Fine-tuning/usage scripts](#quick-tour-of-the-fine-tuningusage-scripts)| Using provided scripts: GLUE, SQuAD and Text generation |
|[Migrating from pytorch-pretrained-bert to pytorch-transformers](#migrating-from-pytorch-pretrained-bert-to-pytorch-transformers)| Migrating your code from pytorch-pretrained-bert to pytorch-transformers |
It contains an example of a conversion script from a PyTorch trained Transformer model.
At some point in the future, you'll be able to seamlessly move from pre-training or fine-tuning models in PyTorch to productizing them in CoreML,
or prototype a model or an app in CoreML then research its hyperparameters or architecture from PyTorch. Super exciting!
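
As a taste of what the PyTorch side of such an export looks like today, here is a minimal sketch that freezes a pretrained model to TorchScript, the usual first step before handing it to conversion tooling. The `gpt2` checkpoint, the dummy input, and the output filename are illustrative assumptions, not the conversion script referenced above.

```python
# Hypothetical sketch: trace a pretrained model to TorchScript as a
# conversion-ready artifact. Checkpoint name and output path are assumptions.
import torch
from pytorch_transformers import GPT2Model, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2Model.from_pretrained('gpt2', torchscript=True)  # configure weights for tracing
model.eval()

# Trace with a dummy input; the traced graph is what conversion tools consume
input_ids = torch.tensor([tokenizer.encode("Here is some text")])
traced_model = torch.jit.trace(model, input_ids)
torch.jit.save(traced_model, "gpt2_traced.pt")
```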
## Online demo
**[Write With Transformer](https://transformer.huggingface.co)**, built by the Hugging Face team at transformer.huggingface.co, is the official demo of this repo’s text generation capabilities.
You can use it to experiment with completions generated by `GPT2Model`, `TransfoXLModel`, and `XLNetModel`.
> “🦄 Write with transformer is to writing what calculators are to calculus.”
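
If you'd rather poke at completions locally, the sketch below extends a prompt with `GPT2LMHeadModel` using plain greedy decoding. The prompt, the checkpoint, the 20-token length, and greedy decoding are all illustrative choices; the online demo uses its own sampling settings.

```python
# Minimal local completion sketch (illustrative; not the demo's sampler)
import torch
from pytorch_transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
model.eval()

input_ids = torch.tensor([tokenizer.encode("Write with transformer is")])
with torch.no_grad():
    for _ in range(20):                                   # extend by 20 tokens
        logits = model(input_ids)[0]                      # (batch, seq_len, vocab_size)
        next_token = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        input_ids = torch.cat([input_ids, next_token], dim=1)

print(tokenizer.decode(input_ids[0].tolist()))
```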
```python
# MODELS is the list of (model class, tokenizer class, checkpoint) triples
# defined earlier in the quick tour
for model_class, tokenizer_class, pretrained_weights in MODELS:
    # Load pretrained model/tokenizer
    tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
    model = model_class.from_pretrained(pretrained_weights)

    # Encode text
    input_ids = torch.tensor([tokenizer.encode("Here is some text to encode", add_special_tokens=True)])  # Add special tokens takes care of adding [CLS], [SEP], <s>... tokens in the right way for each model
    with torch.no_grad():
        last_hidden_states = model(input_ids)[0]  # Models outputs are now tuples
```
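
To see concretely what `add_special_tokens` changes, here is a small illustrative check (the checkpoint and sentence are arbitrary): with BERT it wraps the ids in `[CLS]`/`[SEP]`, without it you get the bare word-piece ids.

```python
# Illustrative only: compare encodings with and without special tokens
from pytorch_transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
plain = tokenizer.encode("Hello world")
special = tokenizer.encode("Hello world", add_special_tokens=True)
print(plain)    # bare word-piece ids
print(special)  # same ids wrapped in [CLS] ... [SEP]
```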