Responsibilities

  • Data Pipeline Development: Design, implement, and maintain data analytics pipelines and processing systems.
  • Data Modeling: Apply data modeling techniques and integration patterns to ensure data consistency and reliability.
  • Data Transformation: Write data transformation jobs in code to optimize data processing.
  • Data Management: Perform data management through data quality tests, monitoring, cataloging, and governance.
  • LLM Integration: Design and integrate LLMs into existing applications, ensuring smooth functionality and performance.
  • Model Development and Fine-Tuning: Develop and fine-tune LLMs to meet specific business needs, optimizing for accuracy and efficiency.
  • Performance Optimization: Continuously optimize LLM performance for speed, scalability, and reliability.
  • Infrastructure Knowledge: Maintain working knowledge of the data and AI infrastructure ecosystem.
  • Collaboration: Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
  • Continuous Learning: Demonstrate a willingness to learn and find solutions to complex problems.

Qualifications

  • Education: Bachelor's or Master's degree in Computer Science, AI, Engineering, or a related field.
  • Experience: At least 2 years of experience in data engineering and at least 3 years as a data scientist.
  • Technical Skills: Proficiency in Python, SQL, and Java; experience with LLM frameworks (e.g., LangChain); and familiarity with cloud computing platforms. Additionally, experience with visualization tools (e.g., Power BI, Tableau, Looker, Qlik).
  • Cloud Computing: Familiarity with cloud computing platforms, such as GCP, AWS, or Databricks.
  • Problem-Solving: Strong problem-solving skills with the ability to work independently and collaboratively.


Desirable

  • System Design: Knowledge of system design and platform thinking to build sustainable solutions.
  • Big Data Experience: Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, DuckDB, Impala, Hive).
  • Data Warehouse Solutions: Experience working with data warehouse solutions, ELT tools, and techniques (e.g., Airflow, dbt, SAS, NiFi).
  • API Development: Experience with API design to facilitate integration of LLMs with other systems.
  • Prompt Engineering: Skills in designing sequential tasks for LLMs to achieve efficient and accurate outputs.
  • Visualization Solutions: Skills in designing and developing dashboards for analytics and insights.
  • Agile Methodologies: Experience with agile software delivery and CI/CD processes.

Location: BTS Ekkamai
Working Day: Mon-Fri (WFA Every Friday)

Experience required
  • any or no experience
Salary
  • Negotiable
Job function
  • Engineering
Job type
  • Full-time

Company overview

Size: 2000-5000 employees
Industry: Retail
Location: Bangkok
Website: corporate.bigc.co.th/
Founded in: 1994
Ranking: 4/5

Big C Supercenter Public Co., Ltd. is one of the leading retailers in Thailand and Southeast Asia. We have more than 30,000 employees working in our Headquarters Office in Bangkok (Rajdamri) and in Big C stores throughout Thailand. As a subsidiary of Berli Jucker Public Co., Ltd., we are a true m ...


Why join us:

DEVELOPMENT PROJECT FOR POTENTIAL PERSONNEL: Believing that success comes when our people are great, Big C has allocated a budget for a potential-personnel development project, with the primary aim of helping qualified and ambitious employees achieve their goals and of retaining them within our o ...


Job location: Khlong Toei
Head office: 97/11 FL.6 RATCHADAMRI ROAD (BTS Chidlom)