Ho Chi Minh
Full-time

Senior Data Scientist, Adtima

As an advertising platform, Adtima distributes ad content from thousands of advertisers to millions of users across Adtima’s ecosystem of products. As a Data Scientist, you will be involved in the entire process, from data collection, cleaning and preprocessing to training models and deploying them to production. The ideal candidate is passionate about Artificial Intelligence and stays up-to-date with the latest developments in the field.

What you will do

  • Extract and analyze large structured and unstructured datasets, applying data-mining and statistical machine learning techniques; 
  • Design, develop and test large-scale data science pipelines and machine learning algorithms; 
  • Stay current on published state-of-the-art algorithms and competing technologies; 
  • Research and investigate academic and industrial data mining, machine learning and modeling techniques for product improvements; 
  • Design and analyze experiments to test new features & products; 
  • Report, visualize and communicate results; 
  • Work with product managers, designers and engineers to build new features and products; 
  • Implement and deliver high-quality features that are built for speed, scale and usability; 
  • Handle end-to-end data processing, troubleshooting and problem diagnosis.

What you will need

  • Degree in Computer Science, Statistics, Applied Math or a related field;
  • Domain expertise in Machine Learning, Probabilistic Graphical Models and Information Retrieval;
  • 2+ years of relevant work experience with large amounts of real-world data;
  • Experience with traditional as well as modern statistical learning techniques, including GLM, Support Vector Machines, Neural Networks, Regularization Techniques, Boosting, Random Forests and other Ensemble Methods;
  • Strong implementation experience with high-level languages such as Python, R, Scala, Java or other general-purpose programming languages;
  • Familiarity with Linux/Unix shell environments;
  • Strong hands-on skills in sourcing, cleaning, manipulating and analyzing large volumes of data;
  • Experience with end-to-end modeling projects emerging from research efforts;
  • Knowledge of the Hadoop ecosystem and related data processing frameworks (e.g. Spark).