Experience:
4 years required
Skills:
Electrical Engineering, Mechanical Engineering, Excel, English
Job type:
Full-time
Salary:
negotiable
- Supervise contractors who perform servicing or preventive maintenance.
- Perform limited maintenance tasks to include: filter changes, battery system PMs, and Rack PDU & Rack ATS replacements.
- Perform root cause analysis for operational issues.
- Troubleshoot facility and rack level events.
- Ensure all personnel on-site follow safety protocols.
- Work on-call and a rotating schedule as needed.
- Take daily operational readings and provide metrics reporting to senior engineers.
- Apply basic support concepts such as ticketing systems, root cause analysis, and task prioritization.
- Diverse Experiences
- AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
- Why AWS?
- Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating. That's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
- Inclusive Team Culture
- AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.
- Mentorship & Career Growth
- We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
- Work/Life Balance
- We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
- BASIC QUALIFICATIONS.
- Associate's Degree or Technical Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- Fluent in English language, both written and spoken.
- 2+ years working in a Data Center or Mission Critical Environment.
- PREFERRED QUALIFICATIONS.
- Bachelor's Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- 4+ years of Data Center Operation Experience.
- Fundamental knowledge of network design and layout as well as low voltage (copper/ fiber) cabling.
- 2+ years with Microsoft Excel, Word, and Outlook.
- Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Skills:
ETL, Power BI, SQL
Job type:
Full-time
Salary:
negotiable
- Responsible for designing data models that support long-term use.
- Help plan the creation of data marts suited to different use cases.
- Work with the ETL / Data Engineer team to align schemas and pipelines with requirements.
- Communicate and translate business requirements into data solutions.
- Act as an in-depth data analyst for business units.
- Design and develop advanced dashboards and reports in Power BI.
- Query data from the DWH using SQL as required, including data wrangling.
- 5 years of experience working in a data-related role (BA or SA).
- Good knowledge of SQL.
- Good knowledge of visualization tools such as Power BI and/or other data tools.
- Good understanding of Data Warehouse concepts and ETL.
- Skills in designing data models and data marts.
- Analytical and problem-solving skills.
- Contact Information:
- K. Sawarin Tel.
- Office of Human Capital.
- DIGITAL AND TECHNOLOGY SERVICES CO., LTD.
- F.Y.I Center 2525 Rama IV Rd, Khlong Tan, Khlong Toei, Bangkok 10110.
- MRT QSNCC Station Exit 1.
Skills:
Apache, Compliance, Automation
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain robust and scalable data pipelines using tools such as Apache Airflow, PySpark, and cloud-native services (e.g., Azure Data Factory, Microsoft Fabric Pipelines); see the sketch after this listing.
- Manage data ingestion from APIs, files, and databases into data lakes or data warehouses (e.g., Microsoft Fabric Lakehouse, Iceberg, DWS).
- Ensure seamless data integration across on-premise, cloud, and hybrid environments.
- Implement data validation, standardization, and transformation to ensure high data quality.
- Apply data encryption, masking, and compliance controls to maintain security and privacy standards.
- AI & Intelligent Automation: Collaborate with Data Scientists to deploy ML models and integrate predictive insights into production pipelines (e.g., using Azure Machine Learning or Fabric Notebooks).
- Support AI-powered automation and data insight generation through tools like Microsoft Co-pilot Studio or LLM-powered interfaces (chat-to-data).
- Assist in building lightweight AI chatbots or agents that leverage existing datasets to enhance business efficiency.
- Qualifications & Skills: 3-5+ years of experience in Data Engineering or AI Engineering roles.
- Proficiency in Python, SQL, and big data frameworks (Apache Airflow, Spark, PySpark).
- Experience with cloud platforms: Azure, Huawei Cloud, or AWS.
- Familiarity with Microsoft Fabric services: OneLake, Lakehouse, Notebooks, Pipelines, and Real-Time Analytics.
- Hands-on experience with Microsoft Co-pilot Studio to design chatbots, agents, or LLM-based solutions.
- Experience in ML model deployment using Azure ML, ModelArts, or similar platforms.
- Understanding of vector databases (e.g., Qdrant), LLM orchestration (e.g., LangChain), and prompt engineering is a plus.
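A minimal sketch of the kind of Airflow-orchestrated pipeline this listing describes. The DAG name, task logic, and paths are illustrative assumptions, not details from the posting; a real implementation would call Fabric/Data Factory or PySpark jobs instead of the placeholder functions.

```python
# Illustrative Airflow DAG: ingest raw data daily, then transform it.
# All names and paths are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_files(**context):
    # Placeholder: pull the day's extracts from an API or landing zone
    # into the lakehouse "bronze" area, keyed by the logical date.
    ds = context["ds"]
    print(f"Ingesting raw files for {ds} into bronze/sales/{ds}/")


def transform_to_silver(**context):
    # Placeholder: validate, standardize, and write curated "silver" tables.
    ds = context["ds"]
    print(f"Transforming bronze/sales/{ds}/ into silver.sales")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_files", python_callable=ingest_raw_files)
    transform = PythonOperator(task_id="transform_to_silver", python_callable=transform_to_silver)

    ingest >> transform              # transformation runs only after ingestion succeeds
```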
Job type:
Full-time
Salary:
negotiable
- Design and implement methods for storing, retrieving, and monitoring data in pipelines, from ingestion of raw data sources through transformation, cleaning, storage, and enrichment, so that both structured and unstructured data are promptly usable, working with data lakes and cloud-based big data technologies.
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI and Tableau; work across workstreams to support data requirements, including reports and dashboards; and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on multiple projects.
- Analyze and perform data profiling to understand data patterns following Data Quality and Data Management processes
- 3+ years of experience in big data technology, data engineering, or data analytics application system development.
- Experience with unstructured data for business intelligence or a computer science background would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
negotiable
- The Lead System Analyst / Senior Data Engineer is assigned to the IECC Project and Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and safe rerunability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks (see the sketch below).
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
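As a rough illustration of the data-quality and reconciliation checks listed above, the sketch below runs a few assertions over a freshly loaded batch. The column names, thresholds, and use of pandas are assumptions for the example only.

```python
# Minimal data-quality checks over a loaded batch; names are illustrative.
import pandas as pd


def run_quality_checks(df: pd.DataFrame, source_row_count: int) -> list[str]:
    failures = []

    # Completeness: mandatory columns must not contain nulls.
    for col in ("txn_id", "account_id", "amount", "booking_date"):
        if df[col].isna().any():
            failures.append(f"completeness: nulls in {col}")

    # Uniqueness: the business key must not repeat.
    if df["txn_id"].duplicated().any():
        failures.append("uniqueness: duplicate txn_id values")

    # Validity: amounts must be non-negative.
    if (df["amount"] < 0).any():
        failures.append("validity: negative amounts")

    # Reconciliation: loaded rows must match the upstream extract.
    if len(df) != source_row_count:
        failures.append(f"reconciliation: {len(df)} loaded vs {source_row_count} extracted")

    return failures


if __name__ == "__main__":
    batch = pd.DataFrame({
        "txn_id": [1, 2], "account_id": ["A", "B"],
        "amount": [10.0, 5.5], "booking_date": ["2024-01-01", "2024-01-01"],
    })
    print(run_quality_checks(batch, source_row_count=2) or "all checks passed")
```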
- Apply now if you have these advantages.
- Bachelor's or Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- At least 8-12 years as a System Analyst / Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or any relevant programming languages) and be able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark); comfortable with the Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
- Why join Krungsri?
- As a part of MUFG (Mitsubishi UFJ Financial Group), we are a truly global bank with networks all over the world.
- We offer a striking work-life balance culture with hybrid work policies (2 days minimum in office per week).
- Unbelievable benefits such as attractive bonuses, employee loans with special rates, and many more.
- Apply now before this role is closed.
- FB: Krungsri Career (http://bit.ly/FacebookKrungsriCareer [link removed]).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer [link removed]).
- Talent Acquisition Department
- Bank of Ayudhya Public Company Limited
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Remark: The bank is required to, and will, verify personal information related to the criminal history of applicants before they are considered for employment with the bank.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the links below.
- EN (https://krungsri.com/b/privacynoticeen).
- Thai (https://krungsri.com/b/privacynoticeth).
Skills:
Big Data, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meeting with users to understand the data requirements and perform database design based on data understanding and requirements with consideration for performance.
- Maintain data dictionary, relationship and its interpretation.
- Analyze problem and find resolution, as well as work closely with administrators to monitor performance and advise any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information System or an IT related field.
- 3+ years of experience in Data Management or Data Engineering (Retail or E-Commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with Big Data technologies like Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Research, ETL, Automation
Job type:
Full-time
Salary:
negotiable
- Lead the design and development of data architecture, ensuring scalability, security, and alignment with business strategy.
- Oversee the collection, transformation, and integration of data from multiple internal and external sources.
- Conduct advanced research and troubleshooting to address complex business and technical problems.
- Design, build, and optimize data pipelines and ETL processes to handle large-scale and real-time data.
- Implement automation solutions to minimize manual intervention and improve data efficiency.
- Provide technical leadership and mentorship to junior engineers, ensuring best practices in coding, testing, and deployment.
- Collaborate with cross-functional stakeholders including Data Scientists, Analysts, and Business Leaders to deliver actionable insights.
- Evaluate and recommend new tools, frameworks, and technologies to enhance data engineering capabilities.
- Job Specification: Bachelor's Degree in Information Technology, Computer Science, Statistics, Mathematics, Business, or a related field (Master's Degree is a plus).
- Minimum of 5 years of experience in data engineering, with at least 2 years in a senior or lead role.
- Proven expertise in the data analytics lifecycle, including business problem framing, KPI/metrics design, exploratory analysis, and presenting data insights.
- Strong hands-on experience with cloud platforms (AWS, GCP, Azure) and advanced programming skills in Python, Java, PySpark.
- Solid knowledge of data processing, ETL frameworks, data warehousing, and messaging queue systems (e.g., Kafka).
- Demonstrated experience in designing highly scalable, resilient, and secure data systems.
Job type:
Full-time
Salary:
negotiable
- We are looking for a skilled Data Engineer to join our team and help build and maintain our data infrastructure. The ideal candidate will be responsible for designing, implementing, and managing our data processing systems and pipelines. You will work closely with data scientists, analysts, and other teams to ensure efficient and reliable data flow throughout the organization.
- Design, develop, and maintain scalable data pipelines for batch and real-time processing.
- Implement ETL processes to extract data from various sources and load it into data warehouses or data lakes (see the sketch after this listing).
- Optimize data storage and retrieval processes for improved performance.
- Collaborate with data scientists and analysts to understand their data requirements and provide appropriate solutions.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security measures and access controls.
- Troubleshoot data-related issues and optimize system performance.
- Stay up-to-date with emerging technologies and industry trends in data engineering.
- Document data architectures, pipelines, and processes.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-4 years of experience in data engineering or similar roles.
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and experience with relational databases (e.g., Databricks, PostgreSQL, MySQL).
- Familiarity with cloud platforms (AWS, Azure, or Airflow) and their data services.
- Knowledge of data warehousing concepts and ETL best practices.
- Experience with version control systems (e.g., Git).
- Understanding of data cleansing, data modeling, and database design principles.
- Solid problem-solving skills and attention to detail.
- Good communication skills and ability to work with technical and non-technical team members.
- Experience with the Azure data platform (ADF, Databricks).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of stream processing technologies (e.g., Kafka, API, Google BigQuery, MongoDB, SFTP sources).
- Experience with containerization technologies (e.g., Docker).
- Experience dealing with large data volumes and optimization skills in development.
- Understanding of machine learning concepts and data science workflows.
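A small sketch of the extract-transform-load flow mentioned in this listing. The file name, table name, and use of SQLite are placeholders chosen so the example is self-contained; a real pipeline would write to Databricks, PostgreSQL, or another warehouse.

```python
# Tiny ETL example: read a CSV extract, standardize it, load it into a table.
import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)  # e.g., a daily export dropped by a source system


def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df.columns = [c.strip().lower() for c in df.columns]  # standardize column names
    df["order_date"] = pd.to_datetime(df["order_date"])   # enforce types
    return df.drop_duplicates(subset=["order_id"])        # basic cleansing


def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    df.to_sql("fact_orders", conn, if_exists="append", index=False)


if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")                 # stand-in for a real warehouse
    load(transform(extract("orders_2024-01-01.csv")), conn)
```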
Skills:
Compliance, AutoCAD
Job type:
Full-time
Salary:
negotiable
- Assist in the design and development of mechanical systems including HVAC, Fire protection, and Hydraulic systems tailored for data center needs.
- Help plan and manage maintenance activities for mechanical systems, ensuring adherence to industry standards and operational efficiency.
- Maintain accurate records of mechanical system designs, maintenance activities, and compliance with safety regulations.
- Participate in site inspections to assess mechanical systems' condition and ensure compliance with design specifications.
- Assist in coordinating with third-party vendors for maintenance and upgrades, ensuring that all work meets established standards.
- Be available to respond to on-site incidents and assist senior engineers in troubleshooting mechanical failures.
- Engage in ongoing training and professional development opportunities to stay updated on the latest technologies in the data center industry.
- Job Qualifications: Bachelor's degree in Mechanical Engineering or a related field is required.
- More than 5 years of experience in mechanical engineering, preferably within a data center or critical environment.
- Basic understanding of HVAC systems, mechanical design principles, and relevant software tools (e.g., AutoCAD).
- Strong problem-solving abilities to identify issues and propose effective solutions.
- Good verbal and written communication skills for effective collaboration with team members and vendors.
- Ability to work well within a team environment while also being capable of taking initiative when necessary.
- Fluent in English both written and verbal (Minimum 750 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Tableau, Power BI, Excel
Job type:
Full-time
Salary:
negotiable
- Leads and improves analytical programs and presents the results.
- Analyzes the variable granular data of the business and regularly uses advanced models to deliver business insights across diverse business domains.
- Answers and anticipates critical business questions and opportunities and delivers insights to the business in ways that make significant impact.
- Demonstrates use of data visualization (Tableau, Power BI, Excel), and analytic tools (Python, R, SQL, KNIME/Alteryx) to grasp the business insights from mountains of data.
- Collaborates with multifunctional teams (Operations, Initiatives, and IT etc.) to improve key business priorities based on data-driven insights.
- Lead the roll-out and development of digital solution per business needs.
Job type:
Full-time
Salary:
negotiable
- The Senior Data Engineer position plays a key role in designing, developing, and managing cloud-based data platforms, as well as creating data structures for high-level data analysis, and works with business and technical teams to ensure that data management is appropriate and supports organizational goals.
- Responsible for the design, construction, and maintenance of optimal and scalable data pipeline architectures on cloud platforms (e.g., GCP, AWS, Azure).
- Oversee the development and management of complex ETL/ELT processes for data ingestion ...
- Author and optimize advanced, high-performance SQL queries for complex data transformation, aggregation, and analysis (see the sketch after this listing).
- Leverage the Python programming language for automation, scripting, and the development of data processing frameworks.
- Administer and optimize cloud-based data warehouse solutions and associated data lakes.
- Collaborate professionally with data scientists, analysts, and key business stakeholders to ascertain data requirements and deliver effective technical solutions.
- Provide mentorship to junior engineers and champion the adoption of data engineering best practices throughout the organization.
- Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related field.
- At least 5 years of experience working in a data engineering or related position.
- Proficient in advanced SQL, including query optimization and performance tuning.
- Experienced in managing and designing architecture on at least one major cloud platform (Google Cloud Platform, AWS, or Azure).
- Skilled in using Python for data processing and advanced pipeline development.
- Experienced with tools and technologies for data ingestion, connectivity, and management.
- Deep understanding of data modeling principles, data warehousing methodologies, and modern data architecture.
- Excellent analytical and problem-solving skills.
- Communication and teamwork skills.
- Ability to plan and manage tasks effectively.
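For the advanced-SQL duties in this listing, a typical pattern is an analytical query with window functions driven from Python. The schema and data below are invented so the example runs against an in-memory SQLite database.

```python
# Analytical SQL from Python: top-selling product per region via a window function.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'A', 120), ('North', 'B', 90),
        ('South', 'A', 200), ('South', 'C', 150), ('South', 'B', 60);
""")

query = """
    SELECT region, product, total
    FROM (
        SELECT region, product, total,
               RANK() OVER (PARTITION BY region ORDER BY total DESC) AS rnk
        FROM (
            SELECT region, product, SUM(amount) AS total
            FROM sales
            GROUP BY region, product
        )
    )
    WHERE rnk = 1
"""
for row in conn.execute(query):
    print(row)  # ('North', 'A', 120.0) and ('South', 'A', 200.0)
```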
Experience:
1 year required
Skills:
SQL, Python, Automation, English
Job type:
Full-time
Salary:
negotiable
- Develop data pipelines and gather data into internal tables using SQL, in-house tools, and Python.
- Collaborate with stakeholders to prepare, process, and validate data for business needs.
- Make recommendations on improvement, maintenance, or other factors to improve the database system.
- Develop reports, dashboards, and automation solutions using SPARK, SQL, Python, Excel, and in-house tools.
- Ensure data integrity by sanitizing, validating, and aligning numbers with accurate logic.
- Requirements: Master's or Bachelor's degree in a quantitative or otherwise relevant field.
- 1-3 years of experience in Data Analytics, Data Engineering, or Business Intelligence.
- Experience in project & stakeholder management responsibilities.
- Strong SQL skills for data querying and Excel proficiency (Python is a plus).
- Strong English communication skills, both verbal and written.
- Detail-oriented and enjoy building from the ground up.
- Fresh Graduates are welcome.
Skills:
Excel, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Translate complex, unstructured problems into clear strategic options.
- Build CEO/Chairman-ready materials to guide major business decisions.
- Synthesize insights from PMO, BI, and BD into integrated recommendations.
- Lead competitive benchmarking, growth modeling, and scenario analysis.
- Own the strategic logic behind major O2O decisions and initiatives.
- Act as a thought partner to senior leaders across the O2O organization.
- Bachelor's degree or higher in Business, Economics, Engineering, or related fields from a top-tier university.
- 2-4 years of experience in management consulting or corporate strategy.
- Preferred titles: Consulting Analyst / Associate Consultant / Junior Consultant / Business Analyst.
- Preferred firms: Bain & Company, Roland Berger, Kearney, EY-Parthenon, Strategy&, Oliver Wyman, etc.
- Strong business acumen and structured problem-solving skills.
- Excellent communication and slide development skills (PowerPoint is a must).
- Advanced Excel skills; experience with financial models or business cases is a plus.
- Fluent in English (verbal & written) and Thai.
- Hypothesis-driven thinking (consulting-grade problem solving).
- Strategic modeling and data interpretation.
- Storytelling via structured, executive-level presentations.
- Cross-functional influence and stakeholder alignment.
- Strong business intuition backed by analytical rigor.
- Proficiency in Excel, PowerPoint; exposure to eCom or retail is a plus.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Experience:
3 years required
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
negotiable
- Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1000 clients across the globe, we have been rolling out solutions in major projects for over a decade - this is made possible by an international team of 7,600 people spread across 5 continents and more than 60 countries. Our solutions focus on four different Business Lines: Information System & Digital, Telecom, Life Sciences and Engineering. We're focused on building and nurturing a top talent community where all our team members can achieve their full potential ...
- Brief Call: Our process typically begins with a brief virtual/phone conversation to get to know you! The objective? Learn about you, understand your motivations, and make sure we have the right job for you!
- Interviews (the average number of interviews is 3 - the number may vary depending on the level of seniority required for the position). During the interviews, you will meet people from our team: your line manager of course, but also other people related to your future role. We will talk in depth about you, your experience, and skills, but also about the position and what will be expected of you. Of course, you will also get to know Amaris: our culture, our roots, our teams, and your career opportunities!
- Case study: Depending on the position, we may ask you to take a test. This could be a role play, a technical assessment, a problem-solving scenario, etc.
- As you know, every person is different and so is every role in a company. That is why we have to adapt accordingly, and the process may differ slightly at times. However, please know that we always put ourselves in the candidate's shoes to ensure they have the best possible experience.
- We look forward to meeting you!
- Design and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark.
- Build and maintain data models and data lakes to support analytics and reporting.
- Develop reusable Python code for transformation, orchestration, and automation.
- Implement and tune complex PySpark and SQL queries for large-scale data processing (see the sketch below).
- Collaborate with Data Scientists, Analysts, and Business Units to deliver scalable solutions.
- Ensure data quality, governance, and metadata management across projects.
- Manage Azure cloud services for data infrastructure and deployment.
- Support daily operations and performance of the Databricks platform.
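A rough sketch of the kind of PySpark transformation described above. The paths, formats, and column names are assumptions for illustration, not details from this listing.

```python
# Illustrative PySpark job: clean raw orders and aggregate daily revenue per country.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_revenue").getOrCreate()

raw = (
    spark.read.format("delta")      # Delta on Databricks; use "parquet"/"csv" elsewhere
    .load("/mnt/bronze/orders")     # hypothetical bronze path
)

daily_revenue = (
    raw.dropDuplicates(["order_id"])                   # basic cleansing
    .withColumn("order_date", F.to_date("order_ts"))   # derive a date column
    .filter(F.col("amount") > 0)                       # drop invalid rows
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("orders"),
    )
)

(daily_revenue.write.format("delta")
    .mode("overwrite")
    .save("/mnt/silver/daily_revenue"))  # hypothetical silver path
```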
- ABOUT YOU
- 3+ years of experience in Data Engineering.
- Experience with Databricks, Unity Catalog, Apache Spark, and distributed data processing.
- Strong proficiency in Python, PySpark, SQL.
- Knowledge of data warehousing concepts, data modeling, and performance optimization.
- Experience with Azure cloud data platforms (e.g., Azure Synapse).
- Familiarity with CI/CD and version control (Git, BitBucket).
- Understanding of real-time data streaming and tools such as Qlik for replication.
- Academic background: Bachelor's or Master's in Computer Science, Engineering, or related field.
- Fluent English. Another language is a plus.
- You have excellent problem-solving skills and can work independently as well as in a team.
- WHY AMARIS?
- Global Diversity: Be part of an international team of 110+ nationalities, celebrating diverse perspectives and collaboration.
- Trust and Growth: With 70% of our leaders starting at entry-level, we're committed to nurturing talent and empowering you to reach new heights.
- Continuous Learning: Unlock your full potential with our internal Academy and over 250 training modules designed for your professional growth.
- Vibrant Culture: Enjoy a workplace where energy, fun, and camaraderie come together through regular afterworks, team-building events, and more.
- Meaningful Impact: Join us in making a difference through our CSR initiatives, including the WeCare Together program, and be part of something bigger.
- Equal opportunity
- Amaris Consulting is proud to be an equal opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability or other characteristics.
Experience:
4 years required
Skills:
Procurement, Electrical Engineering, Mechanical Engineering
Job type:
Full-time
Salary:
negotiable
- Establish performance benchmarks, conduct analysis, and prepare reports on all aspects of the critical facility operations and maintenance.
- Responsible for the on-site management of sub-contractors and vendors, ensuring that all work performed is in accordance with established practices and procedures.
- Manage relationship with third party Colocation providers and their facility staff.
- Responsible for the operation of and management of both routine and emergency services on a variety of critical systems such as: switchgear, generators, UPS systems, power distribution equipment, chillers, cooling towers, computer room air handlers, building monitoring systems, etc.
- Data Center capacity planning and reporting.
- Assist in the design and build out of new facilities.
- May assist in projects to increase current Facility efficiency.
- Work with IT managers and other business leaders to coordinate projects, manage capacity, and optimize plant safety, performance, reliability and efficiency.
- Deliver quality service and ensure all customer demands are met.
- Procurement for DCEO related expenditure.
- Responsible for asset and inventory management.
- Diverse Experiences
- AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
- Why AWS
- Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating. That's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
- Work/Life Balance
- We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why flexible work hours and arrangements are part of our culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
- Inclusive Team Culture
- AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.
- Mentorship and Career growth
- We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
- BASIC QUALIFICATIONS.
- 4+ years of relevant work experience maintaining electrical, mechanical, HVAC, and fire systems in a data center or critical space facility.
- Ability to participate in a 24 x 7 rotating shift roster.
- Understanding of the electrical and mechanical systems used in a data center environment, including but not limited to DRUPS, Transformers, Generators, Switchgear, UPS systems, ATS/STS units, PDUs, Chillers, AHUs and CRAC units.
- PREFERRED QUALIFICATIONS.
- Bachelor's Degree in either Electrical Engineering, HVAC, Mechanical Engineering or a relevant technical (military/trade school) degree and relevant experience in a critical environment.
- Understanding of the electrical and mechanical systems used in a data center environment, including but not limited to DRUPS, Transformers, Generators, Switchgear, UPS systems, ATS/STS units, PDUs, Chillers, AHUs and CRAC units.
- Experience in management of vendors/contractors performing construction, maintenance and upgrading works in large-scale critical environment.
- Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Skills:
ISO 27001, English
Job type:
Full-time
Salary:
negotiable
- Responsible for monitoring, controlling, and managing facility systems covering electrical power, air conditioning, and networks to support data center operations.
- Verify customer requirements and coordinate system installation and troubleshooting with service providers (vendors) to ensure work is correct and complete according to operating practices.
- Control and coordinate preventive maintenance and repair of facility systems such as generators, UPS units, power distribution systems, air-conditioning systems, and the installation of network equipment.
- Act as 1st level support & troubleshooting for Facility systems in the Data Center, such as network, electrical, and air-conditioning systems.
- Prepare operating procedures and work manuals for facility system operations in line with ISO and other relevant operational standards (e.g., ISO 20000 for service, ISO 27001 for security, ISO 50001 for energy management, and others such as ISO 22301, PCI DSS, TCOS), including related forms, records, and reports.
- Summarize and report critical incidents to the team lead, and prepare statistical and analysis reports on a daily, monthly, and quarterly basis.
- Bachelor s degree in electrical power, mechanic or related fields.
- Thai nationality, Male, Age 20 - 25 years old.
- Have basic technical knowledge in Data Center facilities (Electrical/Mechanical).
- Able to work under pressure.
- Able to work with a team.
- Fair communication in English.
Experience:
2 years required
Skills:
Research, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Develop machine learning models such as credit, income estimation, and fraud models (see the sketch after this listing).
- Research on cutting-edge technology to enhance existing model performance.
- Explore and conduct feature engineering on existing data set (telco data, retail store data, loan approval data).
- Develop a sentiment analysis model to support the collection strategy.
- Bachelor Degree in Computer Science, Operations Research, Engineering, or related quantitative discipline.
- 2-5 years of experiences in programming languages such as Python, SQL or Scala.
- 5+ years of hands-on experience building and implementing AI/ML solutions (for the senior role).
- Experience with python libraries - Numpy, scikit-learn, OpenCV, Tensorflow, Pytorch, Flask, Django.
- Experience with source version control (Git, Bitbucket).
- Proven knowledge of REST APIs, Docker, Google BigQuery, and VS Code.
- Strong analytical skills and data-driven thinking.
- Strong understanding of quantitative analysis methods in relation to financial institutions.
- Ability to clearly communicate modeling results to a wide range of audiences.
- Nice to have.
- Experience in image processing or natural language processing (NLP).
- Solid understanding of collection models.
- Familiar with MLOps concepts.
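A minimal sketch of the model-development work described in this listing. The features, synthetic data, and gradient-boosting choice are placeholders; the actual credit, income estimation, and fraud models would be trained on the telco, retail, and loan data mentioned above.

```python
# Train and evaluate a simple credit-default style classifier on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.normal(30_000, 12_000, n),  # stand-in for estimated income
    rng.integers(0, 60, n),         # stand-in for days past due
    rng.uniform(0, 1, n),           # stand-in for credit utilization
])
# Synthetic default flag loosely driven by the features above.
logit = -2.0 + 0.00004 * (30_000 - X[:, 0]) + 0.05 * X[:, 1] + 1.5 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {roc_auc_score(y_test, probs):.3f}")
```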
Experience:
5 years required
Skills:
AutoCAD, Visio, English
Job type:
Full-time
Salary:
negotiable
- Responsible for planning preventive maintenance schedules for the electrical system.
- Responsible for coordinating and managing vendors and suppliers on preventive maintenance and payment plans.
- 2nd Level support to Data Center Operation (FOC), on site to solve Incident and Problem management.
- 2nd Level support to engineer team all site, Data Center (TT1, TT2, MTG, BNA).
- To create & update reports and documents to comply with ISO 20k, 22k, 27k, 50k & TCOS standards.
- Review PUE, cost saving energy and report.
- Measure air system efficiency and record it in an annual report.
- Responsible for implementing electrical systems such as MU, TR, MDB, GEN, UPS, RECT, BATT, and ATS.
- Bachelor degree of Engineering, Electrical engineering or related field.
- More than 5 years of experience in the maintenance of electrical systems such as RMU, TR, MDB, GEN, UPS, RECT, BATT, and ATS; implementing and supporting electrical systems in buildings or Data Centers.
- At least 1 year of experience designing electrical systems (such as RMU, TR, MDB, GEN, UPS, RECT, BATT, ATS), and implementing and supporting electrical systems in buildings.
- Able to use the program AutoCAD, Visio.
- Able to work as part of a team, and to work or stand by on call on holidays.
- Able to work overtime if required and to respond to hotline calls (less than 1 hour to arrive on site from home).
- Proficiency in English communication is beneficial for both reading and writing.
- Work Location: TrueIDC - Bangna Site (KM26).
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch / real-time / near real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
Experience:
5 years required
Skills:
Python, ETL, Java
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance (see the sketch at the end of this listing).
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
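The data-security bullet in this listing is often realized as simple column-level masking before data leaves a restricted zone. A minimal sketch follows; the column names and literal salt are illustrative only, and a production pipeline would pull the salt/keys from a secret store.

```python
# Minimal PII-masking sketch: pseudonymize identifiers and mask emails.
import hashlib

import pandas as pd

SALT = "replace-with-secret-from-a-vault"  # placeholder, never hard-code in production


def pseudonymize(value: str) -> str:
    """Deterministic salted hash so joins across tables still work."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()


def mask_email(email: str) -> str:
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"


def mask_pii(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["customer_id"] = out["customer_id"].map(pseudonymize)
    out["email"] = out["email"].map(mask_email)
    return out.drop(columns=["phone"])  # drop fields with no analytical use


if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": ["C001", "C002"],
        "email": ["alice@example.com", "bob@example.org"],
        "phone": ["+66-81-000-0000", "+66-81-111-1111"],
        "spend": [1200.50, 88.00],
    })
    print(mask_pii(sample))
```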
- 1
- 2
