
21 1039

EXTERNAL COMPETITIVE EXAMINATION
FOR SECOND-CLASS CONTROLLER OF PUBLIC FINANCES (CONTRÔLEUR DES FINANCES PUBLIQUES)
ASSIGNED TO INFORMATION PROCESSING AS A
PROGRAMMER

YEAR 2021

_____

WRITTEN ELIGIBILITY TEST No. 3

Duration: 1 hour 30 minutes – Coefficient: 1

_____

Translation of a text in English taken from a computing journal or from computing documentation

_____

Only the points obtained above 10/20 are taken into account.

_____

Important recommendations
Instructions on how to fill in the dedicated answer booklet appear overleaf.
On pain of cancellation, apart from the fold-over header flap, answer papers must be entirely anonymous and must bear no identifying element such as name, first name, signature, initials, location, number, or any other indication, even a fictitious one, unrelated to the subject.
On the answer papers, candidates must write and, where necessary, underline using a ballpoint, fountain, or felt-tip pen in black or blue ink only. Likewise, the use of highlighter pens is prohibited.
Candidates must comply with the directives given.

Please turn over.


On the inside of the fold-over flap, the candidate will fill in the information requested and comply with the instructions given:

Birth name
Usual first name
Day, month and year
Candidate number
Signature (mandatory)

External examination – Controller of Public Finances – Programmer
Test: 051 – English
Date: 08 03 2021
If applicable, state the number of additional insert sheets.
Follow the instructions given for the identification labels.

UNDER NO CIRCUMSTANCES MAY THE CANDIDATE CLOSE THE FOLD-OVER FLAP BEFORE HAVING BEEN AUTHORISED TO DO SO BY THE INVIGILATION PANEL
ENGLISH
Subject code: 051

Candidates may have at their disposal on the examination table: writing materials, a ruler, and highlighters.

Natural language processing explained

We talk to our devices, and sometimes they recognize what we are saying correctly. We use free
services to translate foreign language phrases encountered online, and sometimes they give us an
accurate translation. Although natural language processing has been improving by leaps and
bounds, it still has considerable room for improvement.
What is natural language processing?
Natural language processing, or NLP, is currently one of the major successful application areas for
deep learning, despite stories about its failures. The overall goal of natural language processing is to
allow computers to make sense of and act on human language.
Historically, natural language processing was handled by rule-based systems, initially by writing
rules for, e.g., grammars and stemming. Aside from the sheer amount of work it took to write those
rules by hand, they tended not to work very well.
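The hand-written rules mentioned above can be illustrated with a minimal, purely hypothetical suffix-stripping stemmer (not any particular historical system); the rule list and thresholds below are invented for the sketch:

```python
# Suffix-stripping rules, tried in order; the first match wins.
# Each pair maps a suffix to its replacement.
RULES = [
    ("sses", "ss"),   # "classes" -> "class"
    ("ies", "i"),     # "ponies"  -> "poni"
    ("ing", ""),      # "training" -> "train"
    ("ed", ""),       # "trained"  -> "train"
    ("s", ""),        # "rules"    -> "rule"
]

def stem(word: str) -> str:
    """Apply the first matching suffix rule, leaving short words alone."""
    for suffix, replacement in RULES:
        # Require a few characters of stem so "is" is not reduced to "i".
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)] + replacement
    return word

print(stem("training"))  # train
print(stem("ponies"))    # poni
```

Even this toy version shows why such systems tended not to work well: "poni" is not a word, and every irregular form would need yet another hand-written rule.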
After pretty much giving up on hand-written rules in the late 1980s and early 1990s, the NLP
community started using statistical inference and machine learning models.
Phrase-based statistical machine translation models still needed to be tweaked for each language
pair, and the accuracy and precision depended mostly on the quality and size of the textual corpora
available for supervised learning training.
In the fall of 2016, Google Translate suddenly went from producing, on average, “word salad”
with a vague connection to the meaning in the original language, to emitting polished, coherent
sentences more often than not, at least for supported language pairs such as English-French,
English-Chinese, and English-Japanese.
That dramatic improvement was the result of a nine-month concerted effort by the Google Brain
and Google Translate teams to revamp Google Translate from using its old phrase-based statistical
machine translation algorithms to using a neural network trained with deep learning and word
embeddings using Google’s TensorFlow framework. Within a year neural machine translation
(NMT) had replaced statistical machine translation (SMT) as the state of the art.
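The word embeddings mentioned above map each word to a vector of numbers so that related words end up close together. A minimal sketch of the idea, using tiny hand-picked vectors purely for illustration (real embeddings are learned from corpora and have hundreds of dimensions):

```python
import math

# Toy 3-dimensional vectors, invented for this sketch.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, lower for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(embeddings["king"], embeddings["queen"]))  # high, close to 1
print(cosine(embeddings["king"], embeddings["apple"]))  # much lower
```

A neural translation model consumes such vectors rather than raw words, which is one reason it generalises better than phrase tables over literal strings.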
Was that magic? No, not at all. It wasn’t even easy. The researchers working on the conversion had
access to a huge corpus of translations from which to train their networks, but they soon discovered
that they needed thousands of GPUs for training, and that they would need to create a new kind of
chip, a Tensor Processing Unit (TPU), to run Google Translate on their trained neural networks at
scale.
InfoWorld, May 29, 2019

IMPRIMERIE NATIONALE – 21 1039 – Based on documents supplied