Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Stability and scalability assessment of KubeVela

less than 1 minute read

Published:

This is an article published on the CNCF blog. It includes a comprehensive assessment of the KubeVela project, including load testing and performance optimization details.

KubeVela: the road to cloud native application and platform engineering

less than 1 minute read

Published:

This is an article published on the CNCF blog. It discusses the application delivery challenges in today’s cloud native era and traces how KubeVela has evolved with OAM (the Open Application Model) to tackle these challenges.

The multicluster architecture in cloud native era

less than 1 minute read

Published:

This is a Chinese article posted on InfoQ that discusses multicluster techniques on Kubernetes, along with their challenges and possible solutions in the cloud native era.

Portfolio

Publications

MRT: Tracing the Evolution of Scientific Publications

Published in IEEE Transactions on Knowledge and Data Engineering, 2021

This paper is about mining scientific data with machine learning methods.

Recommended citation: Yin, Da & Tam, Weng & Ding, Ming & Tang, Jie. (2021). MRT: Tracing the Evolution of Scientific Publications. IEEE Transactions on Knowledge and Data Engineering. PP. 1-1. 10.1109/TKDE.2021.3088139. http://keg.cs.tsinghua.edu.cn/jietang/publications/TKDE21-Yin-et-al-MRT-Tracing-the-Evolution-of-Scientific-Publications.pdf

Controllable Generation from Pre-trained Language Models via Inverse Prompting

Published in The 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2021

This paper is about using heuristic methods to improve the quality of text generated by large language models.

Recommended citation: Zou, Xu & Yin, Da & Zhong, Qingyang & Yang, Hongxia & Yang, Zhilin & Tang, Jie. (2021). Controllable Generation from Pre-trained Language Models via Inverse Prompting. 2450-2460. 10.1145/3447548.3467418. http://keg.cs.tsinghua.edu.cn/jietang/publications/KDD21-Zou-et-al-Controllable-Generation-from-Pre-trained-Language-Models-via-Inverse-Prompting.pdf

OAG-BERT: Towards a Unified Backbone Language Model for Academic Knowledge Services

Published in The 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2022

This paper is about training a large language model on scientific data.

Recommended citation: Liu, Xiao & Yin, Da & Zheng, Jingnan & Zhang, Xingjian & Zhang, Peng & Yang, Hongxia & Dong, Yuxiao & Tang, Jie. (2022). OAG-BERT: Towards a Unified Backbone Language Model for Academic Knowledge Services. 3418-3428. 10.1145/3534678.3539210. http://keg.cs.tsinghua.edu.cn/jietang/publications/KDD22-Liu-et-al-OAG-BERT.pdf

Talks

Integrating Jenkins with KubeVela

Published:

This is a demo talk presenting how to integrate Jenkins with KubeVela to automate the continuous integration and delivery process of applications.

KubeVela Introduction at CD Foundation

Published:

This is a talk presented at the CD Foundation’s community meeting on behalf of the KubeVela team. It introduces KubeVela’s basic concepts and design principles. A demo is included to show how to integrate KubeVela with third-party projects.

KubeVela Office Hour at KubeCon NA 2022

Published:

This talk introduces the basic concepts of KubeVela and how it works. A live demo is also included which shows how to integrate KubeVela with third-party projects, such as observability tools.

KubeVela Introduction at TAG App Delivery

Published:

This is a KubeVela introduction presented at the general meeting of TAG App Delivery, focusing on the basic ideas and functionality of KubeVela.

Teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.