Skills:
Automation, Power BI, Tableau, English
Job type:
Full-time
Salary:
negotiable
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, reports and dashboards requirements, develop data visualization using Power BI, Tableau by working across workstreams to support data requirements including reports and dashboards.
- Analyze and perform data profiling to understand data patterns following Data Quality and Data Management processes.
- Build proofs of concept and test ETL tool solutions for customer relationship management.
- Develop and maintain customer profile data service using Grails framework, Apache Hadoop, Shell script, and Impala-shell.
- Establish requirements and coordinate production with programmers to control the solution.
- Define application problems by conferring with users and analyzing procedures and processes.
- Write documentation such as technical specifications, troubleshooting guides, and application logs to serve as references.
- 3+ years' experience in big data technology, data engineering, data science, or data analytics application development.
- Experience with unstructured data for business intelligence or a computer science background would be an advantage.
- Java, Groovy, JavaScript, Perl, Shell Script.
- Grails Framework, Catalyst Framework, Nodejs.
- MySQL, MongoDB, MariaDB, Apache Hadoop, Impala.
- Documentation, testing, and maintenance.
- IntelliJ IDEA, Visual Studio Code, Postman, RoboMongo, MobaXterm, WinSCP.
- English communication.
- Fast learner, creative, and a team player.
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and design databases based on that understanding with consideration for performance.
- Maintain the data dictionary, data relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce background is preferable).
- Expert-level experience with query languages (SQL), including Databricks SQL and PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Data Analysis, ETL, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Data Architecture: Design, develop, and maintain the overall data architecture and data pipeline systems to ensure efficient data flow and accessibility for analytical purposes.
- Data Integration: Integrate data from multiple sources, including point-of-sale systems, customer databases, e-commerce platforms, supply chain systems, and other relevant data sources, ensuring data quality and consistency.
- Data Modeling: Design and implement data models that are optimized for scalability, ...
- Data Transformation and ETL: Develop and maintain efficient Extract, Transform, and Load (ETL) processes to transform raw data into a structured format suitable for analysis and reporting.
- Data Warehousing: Build and maintain data warehouses or data marts that enable efficient storage and retrieval of structured and unstructured data for reporting and analytics purposes.
- Data Quality and Monitoring: Implement data quality checks and monitoring mechanisms to identify and resolve data inconsistencies, anomalies, and issues in a timely manner.
- Performance Optimization: Optimize data processing and query performance to ensure efficient data retrieval and analysis, considering factors such as data volume, velocity, and variety.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong programming skills in languages such as Python and SQL; C++ is a plus.
- At least 5 years' experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL) and big data technologies (e.g., Hadoop, Spark, Hive).
- Familiarity with cloud-based data platforms, such as AWS.
- Knowledge of ETL tools and techniques for data integration and transformation.
- Location: BTS Ekkamai
- Working Day: Mon-Fri (WFA Every Friday).
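As a rough sketch of the ETL workflow this role describes (extract from a point-of-sale source, transform, load into a warehouse table), here is a minimal example using Python's built-in sqlite3. All table and field names are hypothetical, chosen only for illustration:

```python
import sqlite3

# Toy ETL sketch: extract raw point-of-sale rows, transform them into a
# clean structure, and load them into a SQLite "warehouse" table.
raw_sales = [
    {"store": "S01", "sku": "A100", "qty": "3", "unit_price": "19.90"},
    {"store": "S01", "sku": "A100", "qty": "2", "unit_price": "19.90"},
    {"store": "S02", "sku": "B200", "qty": "1", "unit_price": "5.50"},
]

def transform(rows):
    # Cast string fields to numbers and compute line totals.
    return [
        (r["store"], r["sku"], int(r["qty"]),
         int(r["qty"]) * float(r["unit_price"]))
        for r in rows
    ]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales "
        "(store TEXT, sku TEXT, qty INTEGER, total REAL)"
    )
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_sales), conn)

# Aggregate for reporting: total quantity sold per store.
per_store = dict(conn.execute(
    "SELECT store, SUM(qty) FROM fact_sales GROUP BY store"
))
print(per_store)  # {'S01': 5, 'S02': 1}
```

In a production pipeline the same extract/transform/load split would run against real source systems and a proper warehouse (e.g. Spark plus a cloud data platform), but the structure is the same.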
Skills:
Research, Automation, Statistics
Job type:
Full-time
Salary:
negotiable
- Work on data architecture: apply a systematic approach to plan, create, and maintain data architectures while keeping them aligned with business requirements.
- Collect data: obtain data from the right sources before initiating any work on the database; after formulating a set of dataset processes, store the optimized data.
- Conduct research: investigate issues in the industry that can arise while tackling a business problem.
- Automate tasks: dive into the data and pinpoint tasks where manual participation can be eliminated through automation.
- Bachelor's degree in IT, computer science, statistics, mathematics, business, or a related field.
- Minimum of 3 years' experience in data engineering roles.
- Experience across the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation.
- Experience with data tools and languages such as cloud platforms, Python, Java, or similar.
- Experience with data processing, ETL, workflows, and messaging queues such as Kafka.
- Data warehousing.
- Data structures.
- ETL tools and programming languages (Python, Java, PySpark).
Experience:
5 years required
Skills:
Data Analysis, Python, SQL, English
Job type:
Full-time
Salary:
negotiable
- Analyze and organize raw data.
- Combine raw information from different sources.
- Design and build data models to support business requirements.
- Develop and maintain data ingestion and processing systems.
- Implement data storage solutions (databases and data lakes).
- Ensure data consistency and accuracy through data validation and cleansing techniques.
- Conduct complex data analysis and report on results.
- Explore ways to enhance data quality and reliability.
- Work with cross-functional teams to identify and address data-related issues.
- Write unit/integration tests, contribute to the engineering wiki, and document work.
- Bachelor's or Master's degree in Computer Science, Software Engineering, Computer Engineering, ICT, IT, or any related technical field.
- At least 5 years of experience as a data engineer or in a similar role.
- Experience with schema design and dimensional data modeling.
- Experience and knowledge in Python development.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Experience building and optimizing data pipelines, architectures and data sets.
- Experience designing, building, and maintaining data processing systems.
- Experience with orchestration tools, e.g. for batch and real-time data processing.
- Experience with CI/CD pipelines for data.
- Experience with big data.
- Familiarity with data integration and ETL tools.
- Strong problem-solving and analytical skills.
- Able to speak Thai fluently, with a basic command of English.
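The data validation and cleansing work mentioned in this posting might look like the following minimal Python sketch. The field names and rules here are hypothetical, invented purely for illustration:

```python
# A minimal sketch of record validation and cleansing, the kind of step a
# pipeline runs before loading data. Field names and rules are hypothetical.

def clean_record(rec):
    """Return a normalized record, or None if it fails validation."""
    email = rec.get("email", "").strip().lower()
    if "@" not in email:
        return None                      # reject malformed emails
    try:
        age = int(rec.get("age"))
    except (TypeError, ValueError):
        return None                      # reject non-numeric ages
    if not (0 < age < 130):
        return None                      # reject out-of-range ages
    return {"email": email, "age": age}

raw = [
    {"email": "  Ann@Example.com ", "age": "34"},
    {"email": "not-an-email", "age": "22"},
    {"email": "bob@example.com", "age": "-5"},
]

cleaned = [r for r in (clean_record(x) for x in raw) if r is not None]
print(cleaned)  # [{'email': 'ann@example.com', 'age': 34}]
```

Real pipelines typically express such rules declaratively (schema checks, constraint tests) rather than hand-written functions, but the reject-or-normalize pattern is the same.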
Skills:
Automation, Research, Assurance
Job type:
Full-time
Salary:
negotiable
- Develop efficient and high-quality test strategies, automation frameworks, and test cases to ensure software reliability, meeting project timelines and quality standards.
- Design and document clear and comprehensive test plans (including functional, integration, and performance testing) to facilitate effective communication and collaboration across teams.
- Optimize test execution efficiency, enhance automation coverage, and research emerging ...
- Conduct thorough test reviews and defect analysis, providing actionable feedback to improve software quality and drive continuous improvement within the team.
- Minimize the risk of software defects and security vulnerabilities by maintaining robust testing methodologies, enhancing automation reliability, and improving bug detection accuracy.
- Drive end-to-end testing efforts from test planning to execution, ensuring smooth deployment and ongoing quality assurance in a production environment.
- At least 3 years of hands-on experience in the entire software testing lifecycle, including test planning, automation development, execution, and defect management in a production environment.
- Proficient in Playwright with TypeScript, with experience in developing and maintaining robust automation frameworks for UI and API testing. Ability to write clear, reusable, and maintainable test scripts.
- Strong knowledge of performance testing using K6, including designing load tests, stress tests, and endurance tests to measure system performance under high concurrency. Ability to analyze test results.
- Strong understanding of software testing methodologies, including functional, regression, integration, and exploratory testing.
- Excellent analytical and debugging skills, with the ability to investigate test failures, provide detailed bug reports, and collaborate with developers for resolution.
- Experience with event-driven architectures, ensuring thorough validation of message queues such as RabbitMQ.
- Knowledge of database validation, with a focus on data type consistency, data migrations, and schema evolution. Experience testing applications migrating between different databases (e.g., MongoDB to PostgreSQL or cloud-based solutions) is a plus.
- Familiarity with CI/CD pipelines, ensuring automated tests are seamlessly integrated into DevOps workflows for continuous testing and deployment.
- Remark: Given the nature of the mentioned position, where employees are involved with customer data and asset values, and/or the company, to comply with legal and regulatory standards established by the Securities and Exchange Commission, as well as to align with laws and overseeing agencies, the company requires a criminal background check as part of the post-interview process before joining the company. Your criminal history information will be retained for a period of 6 months from the start date.
- Important: Candidate Privacy Policy.
- Important: please read and understand the Human Resources privacy policy for job applicants and internship applicants.
Skills:
Cost Analysis, Quantity Surveying, Electrical Engineering, English
Job type:
Full-time
Salary:
negotiable
- Preparing tender and contract documents.
- Undertaking cost analysis for maintenance and repair work.
- Measuring and valuing the work performed on-site.
- Writing detailed progress reports.
- Forecasting the cost of materials required for the project.
- Procuring the services of contractors and subcontractors, and ensuring they get paid.
- Identifying, analysing, and developing the necessary responses to any commercial risks.
- Advising on the maintenance costs of specific buildings.
- Liaising with construction professionals and clients, such as site engineers and managers.
- Knowledge in Cost Management / Quantity Surveying will be an advantage.
- Experience in MEP Estimating will be an advantage.
- Degree in Mechanical/Electrical Engineering.
- Good verbal and written communication skills in English and Thai.
- Ability to use appropriate software and technology.
- Self-motivated with a high degree of integrity, honesty and ethics.
- Ability to work in a team and independently.
- Enthusiastic to learn and develop the skills required to advance.
- As a team member, you will have an opportunity to learn from our leadership group and work alongside experts in the field.
- Location: One Siam shopping center (Siam Paragon, Siam Center, Siam Discovery)
Skills:
Automation, Research, Assurance, English
Job type:
Full-time
Salary:
negotiable
- Develop efficient and high-quality test strategies, automation frameworks, and test cases to ensure software reliability, meeting project timelines and quality standards.
- Design and document clear and comprehensive test plans (including functional, integration, and performance testing) to facilitate effective communication and collaboration across teams.
- Optimize test execution efficiency, enhance automation coverage, and research emerging ...
- Conduct thorough test reviews and defect analysis, providing actionable feedback to improve software quality and drive continuous improvement within the team.
- Minimize the risk of software defects and security vulnerabilities by maintaining robust testing methodologies, enhancing automation reliability, and improving bug detection accuracy.
- Drive end-to-end testing efforts from test planning to execution, ensuring smooth deployment and ongoing quality assurance in a production environment.
- At least 3 years of hands-on experience in the entire software testing lifecycle, including test planning, automation development, execution, and defect management in a production environment.
- Proficient in Playwright with TypeScript, with experience in developing and maintaining robust automation frameworks for UI and API testing. Ability to write clear, reusable, and maintainable test scripts.
- Strong knowledge of performance testing using K6, including designing load tests, stress tests, and endurance tests to measure system performance under high concurrency. Ability to analyze test results.
- Strong understanding of software testing methodologies, including functional, regression, integration, and exploratory testing.
- Excellent analytical and debugging skills, with the ability to investigate test failures, provide detailed bug reports, and collaborate with developers for resolution.
- Experience with event-driven architectures, ensuring thorough validation of message queues such as RabbitMQ.
- Knowledge of database validation, with a focus on data type consistency, data migrations, and schema evolution. Experience testing applications migrating between different databases (e.g., MongoDB to PostgreSQL or cloud-based solutions) is a plus.
- Familiarity with CI/CD pipelines, ensuring automated tests are seamlessly integrated into DevOps workflows for continuous testing and deployment.
- Possesses a positive attitude and participates in team-building and events.
- Comfortable presenting technical information and project updates to both technical and non-technical stakeholders.
- Be able to communicate in both Thai and English.
- Experience in AI-assisted test case generation, using prompt engineering techniques to reduce the time needed to create test automation scripts. Ability to verify AI-generated test cases for correctness, efficiency, and reliability before deployment.
- A strong understanding of the Fintech industry, particularly the business processes and workflows involved in trading operations.
- Experience with trading strategies (e.g., Auto-DCA, Rebalance).
- Familiarity with Agile development frameworks and Domain-Driven Design concepts.
- Remark: Given the nature of the mentioned position, where employees are involved with customer data and asset values, and/or the company, to comply with legal and regulatory standards established by the Securities and Exchange Commission, as well as to align with laws and overseeing agencies, the company requires a criminal background check as part of the post-interview process before joining the company. Your criminal history information will be retained for a period of 6 months from the start date.
- Important: Candidate Privacy Policy.
- Important: please read and understand the Human Resources privacy policy for job applicants and internship applicants.
Skills:
Oracle, Java, SQL, English
Job type:
Full-time
Salary:
negotiable
- Be familiar with Oracle technology, such as Oracle DB, and related programming languages such as Java, PL/SQL, and Linux shell scripting.
- Familiarity with Oracle Retail Suite or Oracle ERP is an advantage.
- Create data flows, system flows, and detailed designs so the team is clear on the backlog and acceptance criteria.
- Create related documents as knowledge management for the team and stakeholders.
- Coordinate with stakeholders (business team, developers, QA, TPM, DevOps, solution architects) to resolve any blocking issues in software development.
- Understand and apply testing methodology, and work with the QA team to deliver quality software.
- Understand and apply Agile methodology in software development.
- Understand and apply CI/CD, and be familiar with DevOps tools in software development.
- Work with the L1/L2 support teams as L3 support to resolve production issues within SLA.
- Act as a senior team member, working with teammates to deliver quality software and consulting for junior team members to resolve blocking problems.
- Bachelor's degree in Computer Science or a related field.
- 6+ years as a Developer/System Analyst, with leadership experience.
- Proficient in Oracle Database, PL/SQL, Java, and Linux scripting. Experience with Oracle Retail Suite or Oracle ERP/Oracle Retail is a strong advantage.
- Skilled in creating data flows, system flows, and detailed design documentation to clarify requirements and acceptance criteria for the team.
- Strong coordination skills to work effectively with business teams, developers, QA, DevOps, and Solution Architects to resolve development issues and remove blockers.
- Experienced in Agile methodology.
- Ability to serve as L3 support for production issues, providing mentorship to junior team members and ensuring high-quality delivery.
- Good English proficiency.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Software Development, TypeScript, MongoDB, English
Job type:
Full-time
Salary:
negotiable
- Collaborate with the product team to ensure effective implementation of products.
- Perform full lifecycle software development, from design to testing.
- Take ownership and accountability for assigned tasks and projects.
- Design, code, document, and test software using accepted languages and frameworks.
- Follow established coding standards and processes.
- Commit to delivering high-quality products to end-users.
- Bachelor's degree in Computer Science, Engineering, or related field.
- 3-5+ years of relevant work experience in software development.
- Professional proficiency in written and verbal English.
- Demonstrated track record in building efficient and scalable architectural solutions.
- Strong expertise with JavaScript, ECMAScript, and TypeScript.
- Strong expertise with backend technologies and Node.js runtime.
- Strong expertise with React and frontend technologies.
- Preferred experience with NestJS and Next.js.
- Knowledge of design patterns, object-oriented programming (OOP), and functional programming concepts.
- Experience with backend performance optimization.
- Experience in handling, configuring, optimizing, and monitoring MongoDB, PostgreSQL, Elasticsearch.
- Experience in test automation techniques.
- Experience in REST and third-party API integrations.
- Familiarity with secure software development practices.
- Exposure to DevOps practices.
- Exposure to collaborating tools like GitHub, JIRA, Confluence.
- Compensation and Benefits.
- Competitive base salary.
- 10 days of annual leave in the first year, growing by 1 day per year to 15 days per year.
- 5 days of business leave.
- 10-20% share of commission from owner and client referrals.
- Health insurance on top of standard social security.
- BOI-sponsored visa & work permit for expats.
- Work Culture.
- International work environment and culture.
- 5 work days per week with 2 days work-from-home.
- Open communication that encourages feedback and idea-sharing.
- Innovative mindset that empowers creativity and new ideas.
- Established and defined career paths.
- Monthly Celebration & Parties.
- Company Values.
- Customers & Partners First.
- Integrity & Reliability.
- Team Collaboration & Innovation.
- Proactive Ownership.
- Performance Culture: Work hard, have fun, make history.
- Send your application now!
- Please email your updated English CV to [email protected] using the format below.
- Email Subject: Senior Software Engineer - [Your Name].
- Please introduce yourself and answer the following questions in English.
- Relocation is mandatory. Are you comfortable relocating to Thailand?
- Why is this position interesting for you?
- What experience and skills will you bring to make you successful in this position?
- What is your current salary and your expected salary range?
- When can you start?
Skills:
ETL, Python, TensorFlow
Job type:
Full-time
Salary:
negotiable
- Develop and deploy machine learning models for demand forecasting, customer segmentation, pricing optimization, and inventory management.
- Work with large-scale datasets and implement efficient feature engineering pipelines to enhance model performance.
- Use PySpark to process and analyze large datasets in a distributed computing environment.
- Collaborate with data engineers to build scalable data pipelines and ensure data quality.
- Implement MLOps best practices for model deployment, monitoring, and retraining in production.
- Design ETL workflows for preprocessing and transforming structured and unstructured data.
- Communicate findings and recommendations to business stakeholders in a clear and actionable manner.
- Stay up to date with the latest advancements in AI, machine learning, and data engineering.
- 5+ years of experience in data science, machine learning, or applied AI.
- Strong programming skills in Python (pandas, NumPy, scikit-learn, TensorFlow/PyTorch).
- Hands-on experience with PySpark for big data processing and analysis.
- Experience with SQL for querying large datasets efficiently.
- Familiarity with cloud platforms (AWS, GCP, Azure) and distributed computing frameworks.
- Knowledge of MLOps practices (model versioning, CI/CD for ML, monitoring, automation).
- Experience working with ETL workflows and data engineering pipelines.
- Strong understanding of statistical analysis, time-series forecasting, and clustering techniques.
- Excellent problem-solving and communication skills, with the ability to translate data insights into business value.
- Experience in the retail industry or working with e-commerce/consumer data.
- Familiarity with tools like Databricks, Airflow, MLflow, and Docker/Kubernetes.
- Experience with deep learning frameworks for NLP or computer vision.
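As a toy illustration of the demand-forecasting task listed above, here is a moving-average baseline in plain Python. The numbers are invented, and a real model would use pandas, scikit-learn, or PySpark as the posting describes; this only sketches the idea:

```python
# Naive demand-forecasting baseline: predict next period's demand as the
# mean of the most recent observations. Data below is made up.

def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    recent = history[-window:]
    return sum(recent) / window

weekly_units_sold = [120, 130, 125, 140, 150, 145]

forecast = moving_average_forecast(weekly_units_sold, window=3)
print(forecast)  # (140 + 150 + 145) / 3 = 145.0
```

Baselines like this are useful as a benchmark: a time-series or ML model (ARIMA, gradient boosting, etc.) is only worth deploying if it beats the simple average on held-out data.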
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Full Stack, Compliance, Java
Job type:
Full-time
Salary:
negotiable
- Develop and optimize fintech applications with modern technologies.
- Develop applications and APIs based on requirements, ensuring high code quality and efficiency.
- Manage tasks as directed by business management, aligning with project goals and deadlines.
- Create high-level and low-level system diagrams to effectively communicate design and functionality to stakeholders.
- Work on scalable microservices, cloud-based solutions, and APIs.
- Ensure best coding practices, security, and compliance in financial systems.
- Collaborate with cross-functional teams to drive innovation.
- Junior Engineers: 2+ years of experience, proficiency in Java/Python/C#/Go/Kotlin/JavaScript, and a passion for fintech.
- Senior Engineers/Team Leads: 4+ years of experience, expertise in microservices, cloud platforms (AWS/GCP/Azure), databases (SQL/NoSQL), and DevOps.
- Strong problem-solving skills and a desire to work on impactful technical solutions.
- Remark: Given the nature of the mentioned position, where employees are involved with customer data and asset values, and/or the company, to comply with legal and regulatory standards established by the Securities and Exchange Commission, as well as to align with laws and overseeing agencies, the company requires a criminal background check as part of the post-interview process before joining the company. Your criminal history information will be retained for a period of 6 months from the start date.
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
Experience:
5 years required
Skills:
Data Analysis, English
Job type:
Full-time
- Deep-dive analysis and monitoring of the credit performance of new acquisitions by analyzing first-year default cases; trigger actions if a vintage reaches critical levels, and coordinate all parties to provide mitigation steps.
- Revise underwriting standards to keep the default rate within MOB12 at an acceptable ratio (so that products remain profitable).
- Quality-check accounts that default in the first year; investigate and analyze the root causes of credit risk defaults and recommend improvements to the credit process.
- Summarize key findings and report it to the Team Head of Retail Credit Policy & Portfolio Management for further actions.
- Perform analysis to find opportunities in selective segments with acceptable risk levels, and prepare underwriting standards according to initiatives.
- Plan and perform A-B testing of different underwriting policies in combination with credit scoring, providing a comparative study of the champion-challenger approach.
- Closely monitor initiatives/test programs and adjust underwriting rules as required.
- Plan and manage risk related change requests in the approval process, organize UAT and coordinate amongst the different stakeholders.
- Act as an expert of data interpretation, perform data investigation related to the portfolio management, help the analytics team by liaising with CRI or Datawarehouse in defining new or changed fields, data structuring and definition.
- Manage the development of the credit risk data self-service platform by drafting requirements and approving results.
- Continuously improving reporting ability by actively coming up with aspects and dimensions that are to be monitored.
- Regularly provide comprehensive, high-quality portfolio risk measurement, analysis, and reporting on the retail segment to senior management and committees within target dates or timelines, enabling the right strategic decisions to be taken in a timely manner through deep-dive analysis.
- Provide recommendation according to deep dive analysis for loss mitigation.
- Support all portfolio management strategy & associated risk reward optimization initiatives across Acquisition, Account management & debt collection & recovery functions.
- Bachelor's degree or higher in Finance & Banking, Economics, Business Administration, or Engineering.
- At least 5 years' total experience in retail credit policy, portfolio monitoring, retail risk management, or finance.
- Strong computer skills required; proficiency in Excel, SAS Enterprise Guide, and Power BI.
- Demonstrated aptitude for analytics and exceptional problem-solving skills.
- Ability to communicate complex ideas effectively - both verbally and in writing - in English.
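The champion-challenger A-B testing this role describes can be sketched in a few lines: approve accounts under two policies and compare first-year default rates. Everything below is a simulation with invented probabilities, not real portfolio data:

```python
import random

# Toy champion/challenger comparison: simulate first-year (MOB12) defaults
# under two underwriting policies. The default probabilities are invented.

random.seed(42)  # deterministic for reproducibility

def simulate_defaults(n_accounts, default_prob):
    """Return the number of simulated accounts that default within MOB12."""
    return sum(random.random() < default_prob for _ in range(n_accounts))

champion_defaults = simulate_defaults(10_000, 0.040)    # current policy
challenger_defaults = simulate_defaults(10_000, 0.032)  # stricter cut-off

champion_rate = champion_defaults / 10_000
challenger_rate = challenger_defaults / 10_000
print(f"champion {champion_rate:.3%} vs challenger {challenger_rate:.3%}")
```

In practice the comparison would be run on real booked accounts with a significance test (e.g. a two-proportion z-test) before promoting the challenger policy.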
Skills:
Data Analysis, Finance, SQL, English
Job type:
Full-time
Salary:
negotiable
- Data Analysis: Conduct in-depth analysis of retail and wholesale business data to address specific business questions and challenges.
- Insight Generation: Interpret results from dashboards and data analyses to develop actionable insights and strategic recommendations.
- Requirement Gathering: Identify business problems, gather requirements, and propose potential solutions, including leveraging AI to enhance business operations.
- ML model creation: Create data analytic models, including both deterministic and machine learning models.
- AI vendors coordination: Collaborate with external AI suppliers to align project objectives with technological capabilities.
- Cross-Departmental Collaboration: Work with various departments to develop and implement data-driven strategies that optimize business processes and decision-making.
- Communication: Act as a liaison between stakeholders and AI vendors, ensuring clear communication and understanding of project requirements.
- Data analytics and AI Strategy Design: Design and recommend how Business Intelligence (BI) and AI technologies can address business problems and provide further insights.
- Decision-making support: Present key findings from own analysis and strategic recommendations to business counterparts and senior management, focusing on project approaches and strategic planning.
- Master's degree in Finance, Business, Engineering, or a related field.
- Strong business acumen, with a deep understanding of retail and wholesale business.
- 3+ years of proven experience in a data analytics role (retail or e-commerce business is preferable).
- Hands-on experience in SQL and cloud data platforms (e.g., Databricks, Snowflake, GCP, or AWS), and high proficiency in Excel.
- Good knowledge of statistics.
- Experience in Python (Pandas, Numpy, SparkSQL), Data Visualisation (Tableau, PowerBI) is a plus.
- Excellent communication skills with the ability to convey complex findings to non-technical stakeholders.
- Fluent in Thai and English.
- A good attitude toward teamwork and a willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
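As a toy illustration of the "deterministic and machine learning model" duty above, the sketch below contrasts a fixed business rule with a minimal learned cut-off; the data and rule are hypothetical:

```python
# Hypothetical records: (units_sold, was_promoted_next_week)
data = [(5, 0), (8, 0), (12, 1), (20, 1), (3, 0), (15, 1)]

def deterministic_model(units):
    """Fixed business rule: promote any SKU selling over 10 units."""
    return 1 if units > 10 else 0

def fit_threshold(data):
    """A minimal 'learned' model: picks the cut-off that misclassifies
    the fewest historical records."""
    best_t, best_err = None, len(data) + 1
    for t in sorted({u for u, _ in data}):
        err = sum((1 if u > t else 0) != y for u, y in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

t = fit_threshold(data)  # learned cut-off from historical data
```

In practice the machine-learning side would use a proper library model; this only shows the conceptual contrast between a fixed rule and a data-driven one.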
Experience:
3 years required
Skills:
English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for air conditioning & fire protection systems.
- Responsible for coordinating and managing vendors and suppliers for preventive maintenance and payment plans.
- 2nd-level support to Data Center Operations (FOC), on site to resolve incidents and problems.
- 2nd-level support to engineering teams at all sites and Data Centers (TT1, TT2, MTG, BNA).
- Create and update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE and energy cost savings, and report on them.
- Measure air system efficiency and record it in an annual report.
- Responsible for implementation of mechanical systems such as comfort air and precision air.
- Responsible for implementation of fire suppression systems such as FM200, NOVEC, CO2, fire sprinklers, fire pumps, fire alarms, and VESDA.
- Working hours are 9:00 - 18:00, with the ability to stand by on call or on site on holidays.
- Bachelor's degree in Engineering, Mechanical Engineering, or a related field.
- At least 3 years of experience in air conditioning maintenance (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air conditioning systems in buildings or data centers.
- At least 1 year of experience in designing air conditioning systems (comfort air, precision air, air-cooled and water-cooled chillers, pump motors): implementing and supporting mechanical air conditioning systems in buildings.
- Knowledge of air diagrams and psychrometric charts.
- Able to work as part of a team and to stand by on call on holidays.
- Able to work overtime if required when a hotline call arrives (less than 1 hour on site from your home).
- Proficiency in English communication is beneficial.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source application and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouse.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in one of these RDBMSs such as Oracle.
- Develops ETL technical specifications; designs, develops, tests, implements, and supports optimal data solutions.
- Develops and documents ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Designs and creates code for all related data extraction, transformation, and loading (ETL) into the databases under responsibility.
- Creates, executes, and documents unit test plans for ETL and data integration processes and programs.
- Performs problem assessment, resolution, and documentation for existing ETL packages, mappings, and workflows in production.
- Tunes the performance of ETL processes and SQL queries, and recommends and implements ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IaC tools like Terraform or Azure ARM templates.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
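The unit-test duty for ETL processes listed above can be illustrated with a minimal transform step and its test; the field names and cleansing rules are hypothetical:

```python
def transform(rows):
    """Cleans raw source rows before loading into the warehouse:
    trims whitespace and normalises empty amounts to None."""
    cleaned = []
    for row in rows:
        cleaned.append({
            "customer_id": row["customer_id"].strip(),
            "amount": float(row["amount"]) if row["amount"] else None,
        })
    return cleaned

# Unit test for the transform step, as the unit-test-plan duty calls for
raw = [{"customer_id": " C001 ", "amount": "99.50"},
       {"customer_id": "C002", "amount": ""}]
out = transform(raw)
assert out[0] == {"customer_id": "C001", "amount": 99.5}
assert out[1]["amount"] is None
```

Real ETL tests would run against the integration tool's own framework; this only shows the pattern of asserting on a transform's output.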
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near-real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good communication and writing skills in English.
- Good interpersonal and communication skills.
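As a sketch of the "retrieve and prepare data" duty above, the example below pulls and aggregates records from a hypothetical relational source using Python's built-in sqlite3; the table and columns are invented for illustration:

```python
import sqlite3

# Hypothetical in-memory table standing in for a relational source system
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "north", 120.0), (2, "south", 80.0), (3, "north", 200.0)])

def regional_totals(conn):
    """Retrieves and aggregates order data ready for business analysis."""
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
    return dict(cur.fetchall())

totals = regional_totals(conn)
```

The same pattern (parameterised query, aggregation pushed to the database) carries over to production engines such as those named in the listing.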
Experience:
3 years required
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
negotiable
- Analyze and organize raw data.
- Combine raw information from different sources.
- Designing and building data models to support business requirements.
- Developing and maintaining data ingestion and processing systems.
- Implementing data storage solutions (databases and data lakes).
- Ensuring data consistency and accuracy through data validation and cleansing techniques.
- Conduct complex data analysis and report on results.
- Explore ways to enhance data quality and reliability.
- Working together with cross-functional teams to identify and address data-related issues.
- Writes unit/integration tests, contributes to engineering wiki, and documents work.
- Bachelor's or Master's degree in Computer Science, Software Engineering, Computer Engineering, ICT, IT, or any related technical field.
- 2-5 years of experience as a data engineer or in a similar role.
- Experience with schema design and dimensional data modeling.
- Experience and knowledge in Python development.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Experience building and optimizing data pipelines, architectures and data sets.
- Experience designing, building, and maintaining data processing systems.
- Experience with orchestration tools, e.g. for batch and real-time data processing.
- Experience with CI/CD pipelines for data.
- Experience with big data.
- Familiarity with data integration and ETL tools.
- Strong problem-solving and analytical skills.
- Able to speak Thai fluently, with a basic command of English.
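The data validation and cleansing responsibility above can be sketched as a simple required-field check that splits incoming records into valid and rejected sets; the field names and records are hypothetical:

```python
def validate_rows(rows, required=("id", "email")):
    """Splits incoming records into valid and rejected sets, checking
    that required fields are present and non-empty."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(f) for f in required):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

records = [{"id": 1, "email": "a@example.com"},
           {"id": 2, "email": ""},          # fails: empty email
           {"email": "b@example.com"}]      # fails: missing id
valid, rejected = validate_rows(records)
```

In a real pipeline the rejected set would be logged or quarantined for review rather than silently dropped.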