News
- Mar 25th, 2024: Task Released.
- Jun 5th, 2024: Baseline Released. Click here
Frequently Asked Questions
Q: What are the details regarding the certificates?
A: The top 3 teams will be awarded certificates from NLPCC and CCF-NLP. For more information, please visit the official NLPCC website: NLPCC Website
Q: What are the details regarding paper submission?
A: The top 5 teams will be invited to submit papers to the NLPCC conference. Submitted papers will undergo a review process; those that meet NLPCC's acceptance standards will be published in the Shared Task Track of NLPCC 2024.
Q: Can you explain the rule that "prohibits the use of large models with more than 1 billion parameters for training and prediction"?
A: This means that no single pre-trained language model used in your system may exceed 1 billion parameters. There is no limit on the parameter count of the model you build on top of it.
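For illustration, the parameter budget described above can be checked by summing the sizes of a model's parameter tensors. The sketch below uses plain Python with hypothetical tensor shapes (the names and dimensions are invented for this example, not taken from any specific model):

```python
import math

# Hypothetical parameter tensors of a pre-trained model, given as
# name -> shape. Real frameworks expose these shapes directly.
pretrained_param_shapes = {
    "embeddings.weight": (30522, 768),
    "encoder.layer0.attention.weight": (768, 768),
    "encoder.layer0.ffn.weight": (768, 3072),
}

def count_parameters(shapes):
    """Total number of scalar parameters across all tensors."""
    return sum(math.prod(shape) for shape in shapes.values())

PARAM_LIMIT = 1_000_000_000  # 1 billion, per the task rule

total = count_parameters(pretrained_param_shapes)
print(f"{total} parameters, within limit: {total <= PARAM_LIMIT}")
```

With PyTorch, the equivalent check on a loaded model is typically `sum(p.numel() for p in model.parameters())` compared against the 1-billion limit.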
ORGANIZERS
Bobo Li
Language and Cognition Computing Laboratory, Wuhan University
NeXT++ Research Center, National University of Singapore