Please use this identifier to cite or link to this item: https://oxfordhealth-nhs.archive.knowledgearc.net/handle/123456789/633
Full metadata record
DC Field: Value [Language]
dc.contributor.author: External author(s) only
dc.date.accessioned: 2020-11-12T18:30:42Z
dc.date.available: 2020-11-12T18:30:42Z
dc.date.issued: 2020-10
dc.identifier.citation: John Pougue Biyong, Bo Wang, Terry Lyons, Alejo J Nevado-Holgado. Information Extraction from Swedish Medical Prescriptions with Sig-Transformer Encoder. arXiv:2010.04897v1 [cs.CL] 10 Oct 2020 [en]
dc.identifier.uri: https://oxfordhealth-nhs.archive.knowledgearc.net/handle/123456789/633
dc.description.abstract: Relying on large pretrained language models such as Bidirectional Encoder Representations from Transformers (BERT) for encoding, and adding a simple prediction layer, has led to impressive performance in many clinical natural language processing (NLP) tasks. In this work, we present a novel extension to the Transformer architecture that incorporates the signature transform into the self-attention model. This architecture is added between the embedding and prediction layers. Experiments on a new Swedish prescription dataset show the proposed architecture to be superior to baseline models in two of the three information extraction tasks. Finally, we compare two embedding approaches: applying Multilingual BERT directly, and translating the Swedish text to English and then encoding it with a BERT model pretrained on clinical notes. [en]
dc.description.sponsorship: Supported by the NIHR [en]
dc.description.uri: https://arxiv.org/abs/2010.04897 [en]
dc.language.iso: en [en]
dc.subject: Natural Language Processing [en]
dc.title: Information Extraction from Swedish Medical Prescriptions with Sig-Transformer Encoder [en]
dc.type: Article [en]
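
The abstract above describes inserting a signature transform (a sequence summary from rough path theory) together with self-attention between the embedding and prediction layers. The following is a minimal PyTorch sketch of one way such a block could look, assuming a depth-2 truncated signature, a small linear projection before the signature, and three output labels; all module and dimension choices here are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn


def signature_depth2(path: torch.Tensor) -> torch.Tensor:
    """Depth-2 truncated path signature of a batch of discrete paths.

    path: (batch, seq_len, dim) -> (batch, dim + dim * dim)
    Level 1 is the total increment of the path; level 2 collects the
    ordered pairwise products of increments (discrete iterated integrals).
    """
    dx = path[:, 1:, :] - path[:, :-1, :]              # stepwise increments
    level1 = dx.sum(dim=1)                             # (batch, dim)
    prefix = torch.cumsum(dx, dim=1) - dx              # increments strictly before each step
    level2 = torch.einsum("bti,btj->bij", prefix, dx)  # sum over s < t of dx_s outer dx_t
    level2 = level2 + 0.5 * torch.einsum("bti,btj->bij", dx, dx)
    return torch.cat([level1, level2.flatten(1)], dim=-1)


class SigTransformerEncoder(nn.Module):
    """Illustrative block: self-attention over pretrained embeddings, then a
    depth-2 signature summarising the attended sequence, then a prediction
    head. Hyperparameters are assumptions, not taken from the paper."""

    def __init__(self, d_model: int = 768, n_heads: int = 8,
                 sig_dim: int = 32, n_labels: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(d_model, sig_dim)              # shrink before the signature
        self.head = nn.Linear(sig_dim + sig_dim ** 2, n_labels)

    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        # emb: (batch, seq_len, d_model) contextual embeddings, e.g. from BERT
        attended, _ = self.attn(emb, emb, emb)
        sig = signature_depth2(self.proj(attended))
        return self.head(sig)


if __name__ == "__main__":
    x = torch.randn(2, 16, 768)              # two dummy prescriptions of 16 subwords
    print(SigTransformerEncoder()(x).shape)  # torch.Size([2, 3])
```

Projecting down before taking the signature keeps the depth-2 feature size (d + d^2) manageable: at d_model = 768 the raw signature would have over half a million components, while at 32 dimensions it has 1,056.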
Appears in Collections: Managing knowledge and information

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.