This study investigates the demand for big data-related jobs in the Chinese labor market by applying the Latent Dirichlet Allocation (LDA) topic modeling algorithm from Natural Language Processing (NLP). Using “big data” as the search keyword, I collect a dataset of job postings from Zhaopin.com, one of the most popular online recruitment platforms in China, and apply a series of text preprocessing techniques to prepare the raw data for unsupervised NLP modeling and visualization. By analyzing the most popular job titles, required skills, educational backgrounds, experience levels, and salary differences across cities and requirements, this paper offers implications for both job seekers and employers in the promising field of big data. The study yields four key findings. First, the most popular job titles span levels from entry to senior, and different positions accordingly require different skill sets. Second, average salaries diverge regionally: positions in first-tier and new first-tier cities command higher pay. Third, although big data-related enterprises generally prefer candidates with higher education and more experience, the entry threshold for such positions is not high. Finally, I call for further studies that leverage innovative methodologies to investigate the ever-growing market for big data talent in China and worldwide, and to provide valuable insights for this promising field.