Exploration and Application of Sentiment Analysis Technology at Meituan

In May 2021, Meituan's NLP Center open-sourced ASAP, the largest Chinese aspect-level sentiment analysis dataset based on real-world scenarios to date. The paper describing the dataset was accepted by NAACL 2021, a top conference in natural language processing, and the dataset has joined 千言 (LUGE), the Chinese open-source data initiative, where it will advance Chinese information processing together with the other open datasets. This article reviews the evolution of sentiment analysis technology at Meituan and its application in typical business scenarios, covering document/sentence-level sentiment analysis, aspect-level sentiment analysis, and opinion triplet extraction. On the application side, these sentiment analysis capabilities have been built into both an online real-time prediction service and an offline batch prediction service. To date, the sentiment analysis service supports more than ten business scenarios inside Meituan.
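
To make the aspect-level task concrete, the following is a minimal sketch of aspect-category sentiment classification using the auxiliary-sentence formulation of Sun et al. [16]: each aspect category is paired with the review as the second segment of a BERT sentence pair and classified into a sentiment label. The checkpoint name, aspect taxonomy, and four-way label set below are illustrative assumptions rather than the production model; in practice the classifier head would be fine-tuned on ASAP-style (review, aspect category) pairs.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative aspect categories; the real ASAP taxonomy defines 18 of them.
ASPECT_CATEGORIES = ["Food#Taste", "Price#Level", "Service#Attitude"]

# One label per aspect; "not mentioned" covers aspects absent from the review.
LABELS = ["positive", "neutral", "negative", "not mentioned"]

# Placeholder checkpoint -- assumed here; a deployed system would load a BERT
# model fine-tuned on (review, aspect) sentence-pair classification.
MODEL_NAME = "bert-base-chinese"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)
model.eval()


def classify_aspects(review: str) -> dict:
    """Predict a sentiment label for each aspect category of one review."""
    results = {}
    for aspect in ASPECT_CATEGORIES:
        # Auxiliary sentence: the aspect category becomes the second segment
        # of the sentence pair, as in Sun et al. (2019).
        inputs = tokenizer(review, aspect, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits
        results[aspect] = LABELS[int(logits.argmax(dim=-1))]
    return results


if __name__ == "__main__":
    print(classify_aspects("菜品味道很好,就是上菜有点慢。"))
```

Framing each aspect as a separate sentence pair keeps the model a standard sequence classifier, so new aspect categories can be added without changing the architecture.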

References
  • [1] https://github.com/Meituan-Dianping/asap
  • [2] Bu J., Ren L., Zheng S., et al. "ASAP: A Chinese Review Dataset Towards Aspect Category Sentiment Analysis and Rating Prediction." In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021.
  • [3] https://www.luge.ai/
  • [4] Zhang L., S. Wang, and B. Liu. "Deep Learning for Sentiment Analysis: A Survey." Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery (2018): e1253.
  • [5] Liu Bing. "Sentiment Analysis and Opinion Mining." Synthesis Lectures on Human Language Technologies 5.1 (2012): 1-167.
  • [6] Peng Haiyun, et al. "Knowing What, How and Why: A Near Complete Solution for Aspect-Based Sentiment Analysis." In Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 34. No. 05. 2020.
  • [7] Zhang Chen, et al. "A Multi-task Learning Framework for Opinion Triplet Extraction." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings. 2020.
  • [8] Yoon Kim. 2014. "Convolutional Neural Networks for Sentence Classification." arXiv preprint arXiv:1408.5882.
  • [9] Peng Zhou, Wei Shi, Jun Tian, Zhenyu Qi, Bingchen Li, Hongwei Hao, and Bo Xu. 2016. "Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification." In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 207–212.
  • [10] Devlin Jacob, et al. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." arXiv preprint arXiv:1810.04805 (2018).
  • [11] Yang Yang, Jiahao, et al. "The Exploration and Practice of BERT at Meituan."
  • [12] Pontiki Maria, et al. "SemEval-2016 Task 5: Aspect Based Sentiment Analysis." International Workshop on Semantic Evaluation. 2016.
  • [13] Pontiki M., et al. "SemEval-2014 Task 4: Aspect Based Sentiment Analysis." In Proceedings of the International Workshop on Semantic Evaluation (2014).
  • [14] Yequan Wang, Minlie Huang, and Li Zhao. 2016. "Attention-Based LSTM for Aspect-Level Sentiment Classification." In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 606–615.
  • [15] Sara Sabour, Nicholas Frosst, and Geoffrey E. Hinton. 2017. "Dynamic Routing Between Capsules." In Advances in Neural Information Processing Systems, pages 3856–3866.
  • [16] Chi Sun, Luyao Huang, and Xipeng Qiu. 2019. "Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence." arXiv preprint arXiv:1903.09588.
  • [17] Qingnan Jiang, Lei Chen, Ruifeng Xu, Xiang Ao, and Min Yang. 2019. "A Challenge Dataset and Effective Models for Aspect-Based Sentiment Analysis." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 6281–6286.
  • [18] Wu Zhen, et al. "Grid Tagging Scheme for End-to-End Fine-Grained Opinion Extraction." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings. 2020.
  • [19] Liu Yinhan, et al. "RoBERTa: A Robustly Optimized BERT Pretraining Approach." arXiv preprint arXiv:1907.11692 (2019).
  • [20] Clark Kevin, et al. "ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators." arXiv preprint arXiv:2003.10555 (2020).
  • [21] Timothy Dozat and Christopher D. Manning. 2017. "Deep Biaffine Attention for Neural Dependency Parsing." In 5th International Conference on Learning Representations, ICLR 2017.
About the Authors

Ren Lei, Jiahao, Zhang Chen, Yang Yang, Mengxue, Ma Fang, Jin Gang, Wu Wei, and others, all from the NLP Center of the Search and NLP Department, Meituan Platform.

Recruitment Information

The NLP Center of Meituan's Search and NLP Department is the core team responsible for Meituan's artificial intelligence R&D. Its mission is to build world-class core natural language processing technology and service capabilities.

The NLP Center has long-term openings for natural language processing algorithm experts and machine learning algorithm experts. If you are interested, please send your resume to renlei04@meituan.com. The specific responsibilities and requirements are as follows.

Responsibilities

  1. Forward-looking exploration of pre-trained language models, including but not limited to knowledge-driven pre-training, task-oriented pre-training, multimodal pre-training, and cross-lingual pre-training;
  2. Training and performance optimization of very large models with more than ten billion parameters;
  3. Forward-looking exploration of fine-tuning techniques, including but not limited to prompt tuning, adapter tuning, and other parameter-efficient transfer learning methods;
  4. Forward-looking exploration of model inference/training compression techniques, including but not limited to quantization, pruning, tensor decomposition, knowledge distillation (KD), and NAS;
  5. Applying pre-trained models in business scenarios such as search, recommendation, and advertising, and delivering on business goals;
  6. Participating in building and promoting Meituan's internal NLP platform.

Requirements

  1. More than 2 years of relevant work experience, with algorithm development experience in at least one of search, recommendation, or advertising, and an eye on progress in industry and academia;
  2. Solid algorithmic foundations; familiar with natural language processing, knowledge graphs, and machine learning; enthusiastic about technology development and application;
  3. Familiar with programming languages such as Python/Java, with solid engineering skills;
  4. Familiar with deep learning frameworks such as TensorFlow and PyTorch, with hands-on project experience;
  5. Familiar with NLP models such as RNN/CNN/Transformer/BERT/GPT, with hands-on project experience;
  6. Goal-oriented; good at analyzing and identifying problems, breaking them down and simplifying them, and finding new opportunities in day-to-day work;
  7. Well organized and a strong driver, able to structure complex work, establish effective mechanisms, and coordinate upstream and downstream teams to reach goals.

Bonus Points

  1. Familiar with the basic principles of the optimizers used in model training, and with the basic methods and frameworks of distributed training;
  2. Familiar with the latest training acceleration methods, such as mixed-precision training, low-bit training, and distributed gradient compression.

| This article was produced by the Meituan technical team, and its copyright belongs to Meituan. You are welcome to repost or use this content for non-commercial purposes such as sharing and communication, provided you credit it as "reposted from the Meituan technical team." Commercial reposting or use without permission is prohibited. For any commercial use, please email tech@meituan.com to request authorization.
