| GPT2-nl output | GPT2-nl quantized output |
|---|---|
| De Belgische overheid heeft beslist dat een van haar topmensen in de gezondheidszorg (van de gezondheidsdienst) na *("The Belgian government has decided that one of its top people in healthcare (of the health service) after")* | De Belgische overheid heeft beslist dat de Fyra de eerste Vlaamse fiets tot nu toe is die volledig voldoet aan de criteria van de *("The Belgian government has decided that the Fyra is the first Flemish bicycle so far that fully meets the criteria of the")* |
| De Belgische overheid heeft beslist dat het gebruik van de nieuwe generatie cookies geen veiligheid of rechtens onschadelijk is *("The Belgian government has decided that the use of the new generation of cookies is no safety or legally harmless")* | De Belgische overheid heeft beslist dat de Belgische nationale voetbalbond een beroep moet doen op de Europese voetbalbond. Dat zou betekenen dat de *("The Belgian government has decided that the Belgian national football association must call on the European football association. That would mean that the")* |
| Toen zei de agent plots dat er iets aan de hand was en hij bracht het in de richting van het *("Then the officer suddenly said that something was going on and he moved it in the direction of the")* | Toen zei de agent plots dat er geen sprake was van valsheid in geschrifte. Het bleek ook een fout van de *("Then the officer suddenly said that there was no question of forgery. It also turned out to be a mistake by the")* |
| Toen zei de agent plots: "Nee, sorry, het wordt tijd, ik denk er nu over, *("Then the officer suddenly said: 'No, sorry, it's about time, I'm thinking it over now,'")* | Toen zei de agent plots tegen mij: "Ik heb er nog nooit van gehoord," zei hij toen zonder zich zorgen te maken *("Then the officer suddenly said to me: 'I've never heard of it,' he said then, without worrying")* |
```python
from onnxruntime import InferenceSession

# Starting the session
session_int8 = InferenceSession("gpt2_opt_int8.onnx")

# Standard code to obtain tokenization outputs
input_ids = ...
attention_mask = ...
position_ids = ...
```
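For completeness, here is a minimal sketch of how those tokenization outputs could be produced and fed to the session. It assumes the model was exported with `Gpt2Helper` (inputs named `input_ids`, `position_ids`, `attention_mask`, and `past_0` through `past_11`, with a float attention mask) and GPT-2 small dimensions (12 layers, 12 heads, head size 64); check the actual input names of your export with `session_int8.get_inputs()`.

```python
import numpy as np
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
encodings = tokenizer("Hello, my name is", return_tensors="np")

input_ids = encodings["input_ids"].astype(np.int64)
attention_mask = encodings["attention_mask"].astype(np.float32)
# Position ids are the cumulative positions of the non-padded tokens
position_ids = (np.cumsum(encodings["attention_mask"], axis=1) - 1).astype(np.int64)

# Empty past key/value states for the first forward pass
# (shape assumes GPT-2 small: 12 layers, 12 heads, head size 64)
batch_size = input_ids.shape[0]
inputs = {"input_ids": input_ids,
          "attention_mask": attention_mask,
          "position_ids": position_ids}
for i in range(12):
    inputs[f"past_{i}"] = np.zeros((2, batch_size, 12, 0, 64), dtype=np.float32)

logits = session_int8.run(None, inputs)[0]  # the first output holds the LM logits
```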
```python
from onnxruntime.transformers.quantize_helper import QuantizeHelper

# Quantize the optimized graph and write the int8 model to disk
QuantizeHelper.quantize_onnx_model(
    "gpt2_opt.onnx",
    "gpt2_opt_int8.onnx")
```
```python
from onnxruntime.transformers.gpt2_helper import Gpt2Helper

# Fuse and optimize the exported graph; False means no float16
# conversion, so the optimized model stays in float32
Gpt2Helper.optimize_onnx(
    "gpt2.onnx",
    "gpt2_opt.onnx",
    False,
    model.config.num_attention_heads,
    model.config.hidden_size)
```
```python
import torch
from transformers import AutoConfig
from onnxruntime.transformers.gpt2_helper import Gpt2Helper, GPT2LMHeadModel, MyGPT2LMHeadModel

# Define some configuration
model_name_or_path = "gpt2"
device = torch.device("cpu")

# Load the model config and the model
config = AutoConfig.from_pretrained(model_name_or_path)
```
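The snippet breaks off after loading the config. A plausible continuation, sketched here under the assumption that the post used `MyGPT2LMHeadModel` (the export-friendly wrapper imported above) and `Gpt2Helper.export_onnx`, would instantiate the model and produce the `gpt2.onnx` file that `Gpt2Helper.optimize_onnx` takes as input:

```python
# Hypothetical continuation: instantiate the export-friendly wrapper
model = MyGPT2LMHeadModel.from_pretrained(model_name_or_path, config=config)
model.eval().to(device)

# Export to the "gpt2.onnx" file consumed by the optimization step
Gpt2Helper.export_onnx(model, device, "gpt2.onnx")
```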
1. Lewis Hamilton overtakes Max Verstappen in closing laps to win Spanish GP. Hamilton extends title lead to 14 points with victory in Barcelona. Red Bull's Daniel Ricciardo finished second with Valtteri Bottas in third. Mercedes team-mate Kimi Raikkonen completed the podium in fourth.
2. at Circuit de Barcelona-Catalunya. World champions made a second pit stop for fresh tyres with 23 laps remaining to try and set up a grandstand finish. Their seven-time world champion did the rest to win the race for the first time in seven years.
3. "We were still here late most evenings discussing strategy and we had all the bases covered in that respect," he said. "Max is driving exceptionally well, as is Valtteri. It's so close between all of us it's going to take perfect delivery each weekend"
4. F1's computers predicted that Hamilton would catch Verstappen on the last lap. But he was soon lapping nearly two seconds faster than the Red Bull. Team-mate Valtteri Bottas cost Hamilton well over a second when
Victory for Lewis Hamilton and Mercedes in the Spanish Grand Prix was a salutary reminder of what a formidable combination the world champion and his team make. Hamilton’s controlled, relentless driving, allied to astute and bold strategy, has been one of the cornerstones of their success.
At the Circuit de Barcelona-Catalunya they gave notice that this mighty edifice is as strong and apparently impregnable as ever.
“We were still here late most evenings discussing strategy and we had all the bases covered in that respect. Of course it just meant I had to do the job on track and coming back from 21 seconds was not easy but it was the right call in the end.”
Losing their pole advantage almost at once, Hamilton and Mercedes orchestrated a superb comeback that left Red Bull unable to match their rivals. Red Bull have a competitive car this season and in Verstappen a driver unafraid to take on Hamilton and beat him. Here, however, they were reminded that they will also have to out-think their rivals
```python
from transformers import pipeline

model = "ml6team/gpt2-small-dutch-finetune-oscar"
generator = pipeline(
    'text-generation',
    device=0,
    model=f'{model}',
    tokenizer=f'{model}')
```
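As a usage sketch, here is how the pipeline could produce continuations like the ones in the comparison table. The prompt matches one of the Dutch samples shown earlier; `max_length`, `do_sample`, and `num_return_sequences` are illustrative values, not taken from the original post.

```python
# Generate two sampled continuations for one of the Dutch prompts
outputs = generator(
    "Toen zei de agent plots",
    max_length=30,
    do_sample=True,
    num_return_sequences=2)
for out in outputs:
    print(out["generated_text"])
```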
```python
# Train the new head first, then gradually unfreeze deeper layer groups,
# giving earlier layers smaller learning rates (ULMFiT-style discriminative
# fine-tuning; the 2.6**4 divisor spreads the rates across the groups)
learn.fit_one_cycle(1, 2e-3)
learn.freeze_to(-2)
learn.fit_one_cycle(1, slice(1e-3/(2.6**4), 1e-3))
learn.freeze_to(-3)
learn.fit_one_cycle(1, slice(5e-4/(2.6**4), 5e-4))

# Export the fine-tuned learner to disk
learn.export(path_result/'dutch_gpt2_model')
```
```python
from torch import nn
from fastai.text.all import L, params  # fastai utilities used by the splitter convention
from transformers import GPT2LMHeadModel

def splitter(model):
    "Split a GPT2 `model` in 3 groups for differential learning rates."
    # First layers group: decoder blocks from 0 to 3
    modules = []
    for i in range(4): modules.append(model.transformer.h[i])
    groups = [nn.Sequential(*modules)]
    # Truncated in the original; assumed continuation: blocks 4-7, then blocks 8-11 plus embeddings and final LayerNorm
    groups.append(nn.Sequential(*[model.transformer.h[i] for i in range(4, 8)]))
    groups.append(nn.Sequential(*[model.transformer.h[i] for i in range(8, 12)], model.transformer.wte, model.transformer.wpe, model.transformer.ln_f))
    return L(groups).map(params)
```
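To tie this to the `learn` object used in the training snippet, the splitter would be passed to the fastai `Learner`, so that `freeze_to` and `slice`-based learning rates operate on these three groups. The exact construction is an assumption, since it is not shown in the post:

```python
from fastai.text.all import Learner, CrossEntropyLossFlat

# Hypothetical wiring (not shown in the post): `dls` is a fastai DataLoaders
# over the Dutch corpus, and `model` is assumed to return raw logits (the
# original likely wrapped GPT2LMHeadModel to that effect).
learn = Learner(dls, model,
                loss_func=CrossEntropyLossFlat(),
                splitter=splitter)
learn.freeze()  # only the last parameter group stays trainable at first
```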