
Using the tool to validate semi-automatically created question/answer pairs #33

Open
almugabo opened this issue Aug 28, 2019 · 1 comment

Comments

@almugabo

This tool has the potential to encourage the sharing of more QA data from various domains.
One idea worth looking into is to combine it with semi-automatically created question/answer pairs (and documents) and use it to validate them, for example by pre-populating the question and answer fields.

In the excellent video tutorial
https://www.youtube.com/watch?v=YhVgl70Tn_k
an open-source system is mentioned at the end which, if I understood it correctly, could help with this. (Close?)
Can someone share the link so we can explore whether it can be used for that purpose?
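The pre-population idea could be sketched roughly as follows: take automatically generated question/answer pairs and emit them in a SQuAD-style JSON structure that an annotation tool could load for human validation. The helper name, the `auto_generated` flag, and the exact field layout are illustrative assumptions, not the annotator's actual input contract.

```python
import json

def prepopulate(title, paragraph, qa_pairs):
    """Wrap auto-generated (question, answer) pairs in a SQuAD-style
    structure so an annotation tool could present them for validation.
    (Hypothetical helper; field names are assumptions for illustration.)"""
    qas = []
    for i, (question, answer) in enumerate(qa_pairs):
        start = paragraph.find(answer)  # -1 if the answer is not verbatim in the text
        qas.append({
            "id": f"auto-{i}",
            "question": question,
            "answers": [{"text": answer, "answer_start": start}],
            # Flag machine-generated pairs so validators know to review them.
            "auto_generated": True,
        })
    return {
        "version": "v2.0",
        "data": [{
            "title": title,
            "paragraphs": [{"context": paragraph, "qas": qas}],
        }],
    }

doc = "UnsupervisedQA was released by Facebook to generate question-answer pairs automatically."
pairs = [("Who released UnsupervisedQA?", "Facebook")]
print(json.dumps(prepopulate("UnsupervisedQA", doc, pairs), indent=2))
```

A validator would then only confirm or correct each pre-filled pair instead of typing it from scratch.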

@fmikaelian (Collaborator) commented Sep 1, 2019

Hi @almugabo

In the tutorial @andrelmfarias is talking about UnsupervisedQA, a project recently released by Facebook. The idea is to create basic question-answer pairs automatically, which could accelerate the creation of QA datasets.

It would indeed be very interesting to add pre-population & validation to the cdQA-annotator to speed up the annotation process. Aside from UnsupervisedQA, we could also add a pre-trained reader model to the loop to suggest answers to the annotating user (you would still have to write the question yourself, though).

Any help on this would be very welcome! 😃
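The reader-in-the-loop idea might look something like the sketch below: given a question the annotator has written and a document, propose an answer span for them to accept or correct. A real implementation would call a pre-trained reader (e.g. a BERT-style QA model); here a trivial keyword-overlap scorer stands in for the model so the sketch stays self-contained, and the function name is a hypothetical choice.

```python
def suggest_answer(question, context, window=8):
    """Suggest an answer span for an annotator to accept or correct.

    Stand-in for a pre-trained reader: scores each window of `window`
    consecutive words by keyword overlap with the question and returns
    the best-scoring span. A real loop would call a QA model instead.
    """
    stop = {"what", "who", "when", "where", "is", "the", "a", "of", "did", "does"}
    keywords = {w.strip("?.,").lower() for w in question.split()} - stop
    words = context.split()
    best_span, best_score = "", -1
    for i in range(max(1, len(words) - window + 1)):
        span = words[i:i + window]
        score = sum(1 for w in span if w.strip("?.,").lower() in keywords)
        if score > best_score:
            best_score, best_span = score, " ".join(span)
    return best_span

context = ("UnsupervisedQA generates question-answer pairs automatically, "
           "which could accelerate the creation of QA datasets.")
print(suggest_answer("What does UnsupervisedQA generate?", context))
```

Swapping the scorer for a real reader model would keep the annotator workflow the same: write a question, review the suggested span, then accept or fix it.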
