Job type:
Full-time
Salary:
Negotiable
- Design and implement methods for storing, retrieving, and monitoring data in pipelines, from ingestion of raw data sources through transforming, cleaning, storing, and enriching the data so it is promptly usable, covering both structured and unstructured data, and working with data lakes and cloud-based big data technology.
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau; work across workstreams to support data requirements, including reports and dashboards; and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns following Data Quality and Data Management processes
- 3+ years of experience in big data technology, data engineering, or data analytics application development.
- Experience with unstructured data for business intelligence, or a computer science background, is an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop programming.
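By way of illustration, the transform-and-clean step described above might look like the following minimal Python sketch; the `customer_id` and `amount` field names are hypothetical, not taken from this posting:

```python
def clean_records(raw_rows):
    """Drop malformed rows and normalize fields before storage.

    raw_rows is a list of dicts as they might arrive from raw
    ingestion; the keys here are assumed purely for the example.
    """
    cleaned = []
    for row in raw_rows:
        # Skip rows missing the mandatory identifier
        if not row.get("customer_id"):
            continue
        cleaned.append({
            "customer_id": str(row["customer_id"]).strip(),
            # Coerce amount to float, defaulting to 0.0 for missing values
            "amount": float(row.get("amount") or 0.0),
        })
    return cleaned
```

A real pipeline would apply the same idea per batch or per micro-batch before writing to the lake.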
Skills:
Automation, ETL, Big Data
Job type:
Full-time
Salary:
Negotiable
- Collaborate with cross-functional teams to understand complex business requirements and translate them into proper data solutions.
- Lead discussions with stakeholders (e.g., business domain experts, data analysts, data scientists, data governance) to design and deliver data solutions and pipelines.
- Design and maintain data models that support business requirements and analytical needs.
- Build and optimize data pipelines for efficient data ingestion, transformation, and delivery (on-premise and/or cloud platforms).
- Ensure proper data governance, quality, and lifecycle management across platforms.
- Proactively identify opportunities for data process improvements and automation.
- Mentor junior data engineers, promote team knowledge sharing, and review code to ensure adherence to coding standards, documentation, testing, and version control.
- Apply now if you have these advantages.
- Bachelor's or Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- 5-7 years of experience in database design, data integration, data pipeline development, and/or ETL/ELT processes.
- Strong understanding of data architecture, data modeling and data solutions.
- Experience in Banking & Financial industry is an advantage.
- Proven ability to work independently and lead technical solutions within a team.
- Experience with at least one of: relational databases (RDBMS), data warehouses, data marts, or big data technology is required.
- A data programming language (e.g., SQL, Python, PySpark) is required.
- ETL/ELT and data pipeline experience.
- Experience with RDBMS, big data technology (Hadoop), or AWS cloud is an advantage.
- Why join Krungsri?
- As part of MUFG (Mitsubishi UFJ Financial Group), we are truly a global bank with networks all over the world.
- We offer a striking work-life balance culture with hybrid work policies (2 days in office per week).
- Unbelievable benefits such as attractive bonuses, employee loans at special rates, and many more.
- Apply now before this role is closed.
- FB: Krungsri Career (http://bit.ly/FacebookKrungsriCareer [link removed]).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer [link removed]).
- Talent Acquisition Department
- Bank of Ayudhya Public Company Limited
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Remark: The bank will verify applicants' personal information relating to criminal history before they are considered for employment with the bank.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the links below.
- EN (https://krungsri.com/b/privacynoticeen).
- TH (https://krungsri.com/b/privacynoticeth).
Skills:
Tableau, Power BI, Excel
Job type:
Full-time
Salary:
Negotiable
- Leads and improves analytical programs and presents the results.
- Analyzes the variable granular data of the business and regularly uses advanced models to deliver business insights across diverse business domains.
- Answers and anticipates critical business questions and opportunities and delivers insights to the business in ways that make significant impact.
- Demonstrates use of data visualization (Tableau, Power BI, Excel), and analytic tools (Python, R, SQL, KNIME/Alteryx) to grasp the business insights from mountains of data.
- Collaborates with multifunctional teams (Operations, Initiatives, and IT etc.) to improve key business priorities based on data-driven insights.
- Leads the roll-out and development of digital solutions per business needs.
Job type:
Full-time
Salary:
Negotiable
- We are looking for a skilled Data Engineer to join our team and help build and maintain our data infrastructure. The ideal candidate will be responsible for designing, implementing, and managing our data processing systems and pipelines. You will work closely with data scientists, analysts, and other teams to ensure efficient and reliable data flow throughout the organization.
- Design, develop, and maintain scalable data pipelines for batch and real-time processing.
- Implement ETL processes to extract data from various sources and load it into data warehouses or data lakes.
- Optimize data storage and retrieval processes for improved performance.
- Collaborate with data scientists and analysts to understand their data requirements and provide appropriate solutions.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security measures and access controls.
- Troubleshoot data-related issues and optimize system performance.
- Stay up-to-date with emerging technologies and industry trends in data engineering.
- Document data architectures, pipelines, and processes.
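As a rough illustration of the ETL responsibilities above, here is a minimal, self-contained batch ETL sketch in Python; the CSV source and the `sales` table are assumptions for the example, standing in for real sources and a real warehouse or data lake:

```python
import csv
import sqlite3

def run_etl(csv_path, conn):
    """Toy batch ETL: extract rows from a CSV source, apply a small
    transform, and load them into a target table.

    The `sales` table and its columns are hypothetical; a production
    pipeline would target a real warehouse or data lake.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Transform: normalize region names and coerce amounts to float
            conn.execute(
                "INSERT INTO sales VALUES (?, ?)",
                (row["region"].strip().lower(), float(row["amount"])),
            )
    conn.commit()
```

The same extract-transform-load shape carries over to larger frameworks; only the connectors and scale change.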
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-4 years of experience in data engineering or similar roles.
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL) and platforms such as Databricks.
- Familiarity with cloud platforms (e.g., AWS or Azure) and their data services.
- Knowledge of data warehousing concepts and ETL best practices.
- Experience with version control systems (e.g., Git).
- Understanding of data cleansing, data modeling, and database design principles.
- Solid problem-solving skills and attention to detail.
- Good communication skills and ability to work with technical and non-technical team members.
- Experience with the Azure data platform (ADF, Databricks).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of stream processing technologies and data sources (e.g., Kafka, APIs, Google BigQuery, MongoDB, SFTP sources).
- Experience with containerization technologies (e.g., Docker).
- Experience handling large datasets and optimization skills in development.
- Understanding of machine learning concepts and data science workflows.
Skills:
Research, ETL, Automation
Job type:
Full-time
Salary:
Negotiable
- Lead the design and development of data architecture, ensuring scalability, security, and alignment with business strategy.
- Oversee the collection, transformation, and integration of data from multiple internal and external sources.
- Conduct advanced research and troubleshooting to address complex business and technical problems.
- Design, build, and optimize data pipelines and ETL processes to handle large-scale and real-time data.
- Implement automation solutions to minimize manual intervention and improve data efficiency.
- Provide technical leadership and mentorship to junior engineers, ensuring best practices in coding, testing, and deployment.
- Collaborate with cross-functional stakeholders including Data Scientists, Analysts, and Business Leaders to deliver actionable insights.
- Evaluate and recommend new tools, frameworks, and technologies to enhance data engineering capabilities.
- Job Specification: Bachelor's Degree in Information Technology, Computer Science, Statistics, Mathematics, Business, or a related field (Master's Degree is a plus).
- Minimum of 5 years' experience in data engineering, with at least 2 years in a senior or lead role.
- Proven expertise in the data analytics lifecycle, including business problem framing, KPI/metrics design, exploratory analysis, and presenting data insights.
- Strong hands-on experience with cloud platforms (AWS, GCP, Azure) and advanced programming skills in Python, Java, PySpark.
- Solid knowledge of data processing, ETL frameworks, data warehousing, and messaging queue systems (e.g., Kafka).
- Demonstrated experience in designing highly scalable, resilient, and secure data systems.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
Negotiable
- The Lead System Analyst/Senior Data Engineer is assigned to the IECC Project and Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and rerunnability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks.
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
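The data-quality dimensions listed above (completeness, uniqueness, and so on) can be sketched as simple metrics; this is a hypothetical illustration, with the `id` column name assumed for the example:

```python
def quality_report(rows, key):
    """Compute simple completeness and uniqueness metrics for a dataset.

    Illustrates two of the data-quality dimensions; `rows` is a list
    of dicts and `key` is the column expected to be unique.
    """
    total = len(rows)
    values = [r.get(key) for r in rows if r.get(key) is not None]
    return {
        # Share of rows where the key is populated at all
        "completeness": len(values) / total if total else 1.0,
        # Share of populated values that are distinct
        "uniqueness": len(set(values)) / len(values) if values else 1.0,
    }
```

In practice such checks run inside the pipeline and gate promotion of a batch to downstream consumers.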
- Apply now if you have these advantages.
- Bachelor's or Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- At least 8-12 years as a System Analyst / Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or other relevant programming languages), able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark); comfortable with the Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
Skills:
Python, SQL, Java
Job type:
Full-time
Salary:
Negotiable
- Data Pipeline Development: Design, implement, and maintain data analytics pipelines and processing systems.
- Data Modeling: Apply data modeling techniques and integration patterns to ensure data consistency and reliability.
- Data Transformation: Write data transformation jobs through code to optimize data processing.
- Data Management: Perform data management through data quality tests, monitoring, cataloging, and governance.
- LLM Integration: Design and integrate LLMs into existing applications, ensuring smooth functionality and performance.
- Model Development and Fine-Tuning: Develop and fine-tune LLMs to meet specific business needs, optimizing for accuracy and efficiency.
- Performance Optimization: Continuously optimize LLM performance for speed, scalability, and reliability.
- Infrastructure Knowledge: Possess knowledge of the data and AI infrastructure ecosystem.
- Collaboration: Collaborate with cross-functional teams to identify opportunities to leverage data to drive business outcomes.
- Continuous Learning: Demonstrate a willingness to learn and find solutions to complex problems.
- Education: Bachelor's or Master's degree in Computer Science, AI, Engineering, or a related field.
- Experience: At least 2 years of experience in data engineering and at least 3 years as a data scientist.
- Technical Skills: Proficiency in Python, SQL, and Java; experience with LLM frameworks (e.g., LangChain); familiarity with cloud computing platforms; additionally, visualization tools such as Power BI, Tableau, Looker, and Qlik.
- Cloud Computing: Familiarity with cloud computing platforms, such as GCP, AWS, or Databricks.
- Problem-Solving: Strong problem-solving skills with the ability to work independently and collaboratively.
- Desirable.
- System Design: Knowledge of system design and platform thinking to build sustainable solutions.
- Big Data Experience: Practical experience with modern and traditional Big Data stacks (e.g., BigQuery, Spark, Databricks, duckDB, Impala, Hive).
- Data Warehouse Solutions: Experience working with data warehouse solutions, ELT tools, and techniques (e.g., Airflow, dbt, SAS, Nifi).
- API Development: Experience with API design to facilitate integration of LLMs with other systems.
- Prompt Engineering: Skills in designing sequential tasks for LLMs to achieve efficient and accurate outputs.
- Visualization Solutions: Skills in designing and developing dashboards for analytics and insights.
- Agile Methodologies: Experience with agile software delivery and CI/CD processes.
- Location: BTS Ekkamai
- Working Day: Mon-Fri (WFA Every Friday).
Job type:
Full-time
Salary:
Negotiable
- The Senior Data Engineer plays a key role in designing, developing, and managing cloud-based data platforms, creating data structures for high-level data analysis, and working with business and technical teams to ensure that data management is appropriate and supports organizational goals.
- Responsible for the design, construction, and maintenance of optimal and scalable data pipeline architectures on cloud platforms (e.g., GCP, AWS, Azure).
- Oversee the development and management of complex ETL/ELT processes for data ingestion ...
- Author and optimize advanced, high-performance SQL queries for complex data transformation, aggregation, and analysis.
- Leverage the Python programming language for automation, scripting, and the development of data processing frameworks.
- Administer and optimize cloud-based data warehouse solutions and associated data lakes.
- Collaborate professionally with data scientists, analysts, and key business stakeholders to ascertain data requirements and deliver effective technical solutions.
- Provide mentorship to junior engineers and champion the adoption of data engineering best practices throughout the organization.
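The SQL authoring and aggregation work described above can be illustrated with a small, runnable query; the in-memory SQLite database and the `orders` table are stand-ins assumed for the example, not this role's actual warehouse:

```python
import sqlite3

# A small runnable illustration of SQL transformation/aggregation work;
# the table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)
# Index the grouping column; on large tables this can help the planner
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
```

On a cloud warehouse the tuning levers differ (partitioning, clustering), but the query-shaping skill is the same.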
- Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related field.
- At least 5 years of experience working in a data engineering or related position.
- Proficient in advanced SQL, including query optimization and performance tuning.
- Experienced in managing and designing architecture on at least one major cloud platform (Google Cloud Platform, AWS, or Azure).
- Skilled in using Python for data processing and advanced pipeline development.
- Experienced with tools and technologies for data ingestion, connectivity, and management.
- Deep understanding of data modeling principles, data warehousing methodologies, and modern data architecture.
- Excellent analytical and problem-solving skills.
- Communication and teamwork skills.
- Ability to plan and manage tasks effectively.
Experience:
3+ years
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
Negotiable
- Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1000 clients across the globe, we have been rolling out solutions in major projects for over a decade - this is made possible by an international team of 7,600 people spread across 5 continents and more than 60 countries. Our solutions focus on four different Business Lines: Information System & Digital, Telecom, Life Sciences and Engineering. We're focused on building and nurturing a top talent community where all our team members can achieve their full potential ...
- Brief Call: Our process typically begins with a brief virtual/phone conversation to get to know you! The objective? Learn about you, understand your motivations, and make sure we have the right job for you!
- Interviews (the average number of interviews is 3 - the number may vary depending on the level of seniority required for the position). During the interviews, you will meet people from our team: your line manager of course, but also other people related to your future role. We will talk in depth about you, your experience, and skills, but also about the position and what will be expected of you. Of course, you will also get to know Amaris: our culture, our roots, our teams, and your career opportunities!
- Case study: Depending on the position, we may ask you to take a test. This could be a role play, a technical assessment, a problem-solving scenario, etc.
- As you know, every person is different and so is every role in a company. That is why we have to adapt accordingly, and the process may differ slightly at times. However, please know that we always put ourselves in the candidate's shoes to ensure they have the best possible experience.
- We look forward to meeting you!
- Design and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark.
- Build and maintain data models and data lakes to support analytics and reporting.
- Develop reusable Python code for transformation, orchestration, and automation.
- Implement and tune complex PySpark and SQL queries for large-scale data processing.
- Collaborate with Data Scientists, Analysts, and Business Units to deliver scalable solutions.
- Ensure data quality, governance, and metadata management across projects.
- Manage Azure cloud services for data infrastructure and deployment.
- Support daily operations and performance of the Databricks platform.
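The "reusable Python code for transformation, orchestration, and automation" responsibility above can be sketched as a small composition helper; the step functions are hypothetical examples, not part of this posting:

```python
from functools import reduce

def pipeline(*steps):
    """Compose transformation steps into a single reusable callable."""
    def run(data):
        # Feed the output of each step into the next, left to right
        return reduce(lambda acc, step: step(acc), steps, data)
    return run

# Hypothetical example steps: double each value, then keep large ones
double = lambda xs: [x * 2 for x in xs]
keep_large = lambda xs: [x for x in xs if x > 4]
```

The same composition idea is what orchestration frameworks formalize at job level, with retries and scheduling added.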
- ABOUT YOU
- 3+ years of experience in Data Engineering.
- Experience with Databricks, Unity Catalog, Apache Spark, and distributed data processing.
- Strong proficiency in Python, PySpark, SQL.
- Knowledge of data warehousing concepts, data modeling, and performance optimization.
- Experience with Azure cloud data platforms (e.g., Azure Synapse).
- Familiarity with CI/CD and version control (Git, BitBucket).
- Understanding of real-time data streaming and tools such as Qlik for replication.
- Academic background: Bachelor's or Master's in Computer Science, Engineering, or related field.
- Fluent English. Another language is a plus.
- You have excellent problem-solving skills and can work independently as well as in a team.
- WHY AMARIS?
- Global Diversity: Be part of an international team of 110+ nationalities, celebrating diverse perspectives and collaboration.
- Trust and Growth: With 70% of our leaders starting at entry-level, we're committed to nurturing talent and empowering you to reach new heights.
- Continuous Learning: Unlock your full potential with our internal Academy and over 250 training modules designed for your professional growth.
- Vibrant Culture: Enjoy a workplace where energy, fun, and camaraderie come together through regular afterworks, team-building events, and more.
- Meaningful Impact: Join us in making a difference through our CSR initiatives, including the WeCare Together program, and be part of something bigger.
- Equal opportunity
- Amaris Consulting is proud to be an equal opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability or other characteristics.
Skills:
Compliance, AutoCAD
Job type:
Full-time
Salary:
Negotiable
- Assist in the design and development of mechanical systems including HVAC, Fire protection, and Hydraulic systems tailored for data center needs.
- Help plan and manage maintenance activities for mechanical systems, ensuring adherence to industry standards and operational efficiency.
- Maintain accurate records of mechanical system designs, maintenance activities, and compliance with safety regulations.
- Participate in site inspections to assess mechanical systems' condition and ensure compliance with design specifications.
- Assist in coordinating with third-party vendors for maintenance and upgrades, ensuring that all work meets established standards.
- Be available to respond to on-site incidents and assist senior engineers in troubleshooting mechanical failures.
- Engage in ongoing training and professional development opportunities to stay updated on the latest technologies in the data center industry.
- Job Qualifications: Bachelor's degree in mechanical engineering or a related field is required.
- More than 5 years of experience in mechanical engineering, preferably within a data center or critical environment.
- Basic understanding of HVAC systems, mechanical design principles, and relevant software tools (e.g., AutoCAD).
- Strong problem-solving abilities to identify issues and propose effective solutions.
- Good verbal and written communication skills for effective collaboration with team members and vendors.
- Ability to work well within a team environment while also being capable of taking initiative when necessary.
- Fluent in English both written and verbal (Minimum 750 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
Negotiable
- The Lead System Analyst is assigned to Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and rerunnability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks.
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
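The idempotency and rerunnability goals above can be sketched with a toy upsert-style load, in which replaying the same batch leaves the target unchanged; the dict standing in for a keyed table is an assumption made for this sketch:

```python
def idempotent_load(target, batch, key):
    """Upsert a batch into `target` keyed by `key`.

    Replaying the same batch leaves `target` unchanged, which is the
    rerunnability property; `target` is a dict standing in for a
    keyed table in this simplified example.
    """
    for row in batch:
        # Insert or deterministically overwrite the existing row
        target[row[key]] = row
    return target
```

Real pipelines achieve the same property with MERGE/upsert statements or partition overwrites keyed on a natural or surrogate key.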
- Apply now if you have these advantages.
- Bachelor's or Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- At least 8-12 years as a System Analyst / Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or other relevant programming languages), able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark); comfortable with the Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
Experience:
4+ years
Skills:
Electrical Engineering, Mechanical Engineering, Excel, English
Job type:
Full-time
Salary:
Negotiable
- Supervise contractors who perform servicing or preventive maintenance.
- Perform limited maintenance tasks to include: filter changes, battery system PMs, and Rack PDU & Rack ATS replacements.
- Perform root cause analysis for operational issues.
- Troubleshoot facility and rack level events.
- Ensure all personnel on-site follow safety protocols.
- Work on-call and a rotating schedule as needed.
- Take daily operational readings and provide metrics reporting to senior engineers.
- Perform basic support concepts such as ticketing systems, root cause analysis, and task prioritization.
- Diverse Experiences
- AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
- Why AWS?
- Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
- Inclusive Team Culture
- AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.
- Mentorship & Career Growth
- We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.
- Work/Life Balance
- We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
- BASIC QUALIFICATIONS.
- Associate's Degree or Technical Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- Fluent in English language, both written and spoken.
- 2+ years working in a Data Center or Mission Critical Environment.
- PREFERRED QUALIFICATIONS.
- Bachelor's Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- 4+ years of Data Center Operation Experience.
- Fundamental knowledge of network design and layout as well as low voltage (copper/ fiber) cabling.
- 2+ years of experience with Microsoft Excel, Word, and Outlook.
- Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Skills:
Research, Automation, Statistics
Job type:
Full-time
Salary:
Negotiable
- Work on Data Architecture. Data engineers use a systematic approach to plan, create, and maintain data architectures while keeping them aligned with business requirements.
- Collect Data. Before initiating any work on the database, they have to obtain data from the right sources. After formulating a set of dataset processes, data engineers store optimized data.
- Conduct Research. Data engineers conduct research in the industry to address any issues that can arise while tackling a business problem.
- Automate Tasks. Data engineers dive into data and pinpoint tasks where manual participation can be eliminated with automation.
- Bachelor's degree in IT, computer science, statistics, mathematics, business, or a related field.
- Minimum of 5 years' experience in data engineering roles.
- Experience across the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation.
- Experience with data tools and languages such as cloud platforms, Python, Java, or similar.
- Experience with data processing, ETL workflows, and messaging queues such as Kafka.
- Data Warehousing.
- Data Structure.
- ETL Tools and Programming Languages (Python, Java, PySpark).
Skills:
Compliance, ETL, Python
Job type:
Full-time
Salary:
Negotiable
- Design and implement frameworks for data quality validation, including rules, thresholds, and metrics to ensure high-quality data.
- Automate data quality monitoring and anomaly detection using tools or custom scripts.
- Collaborate with data stewards to resolve data quality issues and improve processes.
- Develop and maintain centralized metadata repositories to ensure accurate and up-to-date metadata.
- Automate metadata extraction, data lineage tracking, and validation processes.
- Integrate metadata management solutions with tools like Informatica EDC, Axon, or similar platforms.
- Work closely with the Data Governance team to implement data governance policies and data standards across the data lifecycle.
- Create and maintain pipelines to enable data lineage, audit trails, and compliance reporting.
- Support the adoption of governance tools and frameworks to ensure consistent data usage.
- Build robust, scalable, and secure ETL/ELT pipelines for structured and unstructured data.
- Optimize data ingestion, transformation, and storage for hybrid (on-premises and on cloud) environments.
- Ensure data pipeline reliability and performance through monitoring and testing.
- Work closely with data stewards, data analysts, and data governance teams to align engineering efforts with business needs.
- Mentor junior team members on best practices in data engineering and governance.
- 5-10 years of experience in data engineering, with a focus on data quality, metadata, data lineage, and data governance.
- Strong programming skills in Python, SQL, or Scala; experience with modern data frameworks such as PySpark and dbt (Data Build Tool) is a plus.
- Expertise in ETL/ELT tools like Informatica Intelligent Data Management Cloud (IDMC), Talend, Apache NiFi, or similar cloud-native solutions.
- Proficiency with metadata management platforms such as Informatica EDC, Collibra, or Alation, including automation of metadata ingestion, classification, and lineage mapping.
- Hands-on experience with data quality tools (e.g., Informatica Data Engineering Quality (DEQ), Collibra Data Quality & Observability, Great Expectations) and custom validation scripting.
- Strong knowledge of data governance frameworks and tools (e.g., Informatica Axon, Collibra Governance, Alation Data Governance).
- Experience with cloud data platforms and databases (e.g., Snowflake, Databricks, Oracle, SQL Server), as well as data lake/lakehouse architectures.
- Familiarity with multi-cloud and hybrid environments (e.g., AWS, Azure, Google Cloud Platform) and their native data services.
- Metadata automation (auto-tagging, semantic enrichment, and data catalog population).
- Anomaly detection in pipelines and datasets using AI-driven observability tools.
- Data quality improvement through AI-based rules generation, pattern recognition, and automated remediation suggestions.
- Strong problem-solving and analytical abilities.
- Excellent communication and collaboration skills.
- Ability to mentor and guide junior team members.
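As a rough illustration of the custom validation scripting this listing mentions, the sketch below runs rule-based quality checks in plain Python. The column names, rules, and threshold are invented for the example, not taken from any actual tool:

```python
# Minimal data-quality rule check: each rule maps a column to a predicate,
# and a threshold sets the minimum fraction of rows that must pass.
def run_quality_rules(rows, rules, threshold=0.95):
    report = {}
    for column, predicate in rules.items():
        values = [row[column] for row in rows]
        passed = sum(1 for v in values if predicate(v))
        pass_rate = passed / len(values) if values else 1.0
        report[column] = {"pass_rate": pass_rate, "ok": pass_rate >= threshold}
    return report

# Hypothetical rows and rules, for illustration only
rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": ""},
    {"customer_id": 3, "email": "c@example.com"},
]
rules = {
    "customer_id": lambda v: v is not None,
    "email": lambda v: bool(v) and "@" in v,
}
report = run_quality_rules(rows, rules, threshold=0.9)
```

Dedicated tools such as Great Expectations wrap the same idea (rules, thresholds, pass/fail reports) in a richer framework.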
Skills:
Internal Audit, Assurance, Risk Management, English
Job type:
Full-time
Salary:
Negotiable
- Manage and lead a team of data analysts in applying advanced analytics to audit and assurance activities.
- Drive the use of AI/ML techniques (e.g., anomaly detection, behavioral modeling, predictive risk scoring) to identify suspicious or unusual patterns of activity.
- Collaborate with auditors to integrate data-driven insights into audit planning, execution, and reporting.
- Promote innovation by introducing new tools, methods, and technologies into the audit analytics function.
- Act as a trusted advisor to senior management on emerging risks, fraud trends, and digital transformation in audit.
- Support the Department Head in preparing Thai and English versions of Monthly Reports and Quarterly Reports for SMT, the Audit Committee, and others.
- Perform ad-hoc assignments (projects or special audits) assigned by the Department Head or higher in a timely manner. This requires holistic thinking, initiative, innovation, and analytical skills to complete the assigned tasks on time, with good quality and added value.
- People Development - develop and improve the skills, knowledge, and expertise of team members by providing them with appropriate training courses and advice during work review.
- Support the Department Head by serving as an expert in data analytics using modern auditors' tools and providing knowledge of various data sources for the Internal Audit Group.
- Bachelor's degree in business, mathematics, computer science, or management information systems.
- 8 years of relevant/recent data analysis experience in audit, financial, risk management, or technology functions.
- Strong quantitative, analytical, data intuition, and problem-solving skills, and proficiency in data analytics techniques.
- Hands-on knowledge of AI/ML applications in risk, fraud, or audit contexts.
- Familiarity with audit methodologies, risk frameworks, and financial industry regulations.
- Able to manage a team under pressure and deadlines.
- Proficient in computer literacy, especially tools for data analytics.
- Good command of English, both written and spoken.
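The anomaly detection this role calls for can be illustrated, in a very reduced form, by a z-score outlier check over transaction amounts. The figures and the 2-sigma cutoff below are made up for the sketch:

```python
import statistics

# Flag values deviating more than `z` population standard deviations
# from the mean: a basic statistical anomaly check.
def flag_anomalies(amounts, z=2.0):
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []
    return [a for a in amounts if abs(a - mean) / stdev > z]

amounts = [100, 102, 98, 101, 99, 500]  # 500 is the planted outlier
outliers = flag_anomalies(amounts)
```

Real audit analytics would layer behavioral models and ML on top, but the pattern (baseline, deviation measure, threshold) is the same.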
- Talent Acquisition Department Bank of Ayudhya Public Company Limited.
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the link below.
- (https://krungsri.com/b/privacynoticeen).
- Remark: The bank will verify personal information related to the criminal history of applicants before they are considered for employment with the bank.
- Only shortlisted candidates will be contacted.
- FB: Krungsri Career.
- LINE: Krungsri Career.
Experience:
1 year or more
Skills:
SQL, Python, Automation, English
Job type:
Full-time
Salary:
Negotiable
- Develop data pipelines and gather data into internal tables using SQL, in-house tools, and Python.
- Collaborate with stakeholders to prepare, process, and validate data for business needs.
- Make recommendations on improvement, maintenance, or other factors to improve the database system.
- Develop reports, dashboards, and automation solutions using SPARK, SQL, Python, Excel, and in-house tools.
- Ensure data integrity by sanitizing, validating, and aligning numbers with accurate logic.
- Requirements: Master's or Bachelor's degree in a quantitative or relevant field.
- 1-3 years of experience in Data Analytics, Data Engineering, or Business Intelligence.
- Experience in project & stakeholder management responsibilities.
- Strong SQL skills for data querying and Excel proficiency (Python is a plus).
- Strong English communication skills, both verbal and written.
- Detail-oriented and enjoy building from the ground up.
- Fresh Graduates are welcome.
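As a rough sketch of the pipeline responsibilities above (develop, sanitize, validate), this example uses Python's built-in sqlite3 module. The table and column names, and the "non-negative amount" rule, are invented for illustration:

```python
import sqlite3

# Minimal ingest-transform sketch: load raw rows, validate,
# and write a cleaned internal table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [(1, 100.0), (2, -5.0), (3, 42.5)],  # -5.0 fails validation
)
# Sanitize: keep only rows passing the validation rule
conn.execute(
    "CREATE TABLE clean_orders AS "
    "SELECT order_id, amount FROM raw_orders WHERE amount >= 0"
)
count = conn.execute("SELECT COUNT(*) FROM clean_orders").fetchone()[0]
```

In production the same shape would run against a warehouse via Spark or in-house tools rather than an in-memory SQLite database.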
Skills:
SAP, Excel, Power BI
Job type:
Full-time
Salary:
Negotiable
- Analyze market demand data and sales trends to produce an accurate Demand Forecast.
- Collect and verify data from the Sales, Marketing, and Production Planning departments to keep forecasts aligned.
- Plan and control inventory levels to match market demand.
- Coordinate with the Production, Procurement, and Logistics departments to ensure products are available for sale according to plan.
- Analyze demand- and supply-side risks and opportunities, and propose corrective actions when actuals deviate from plan.
- Prepare sales forecast, inventory status, and planning performance reports for management.
- Improve planning processes using Data Analytics tools and the SAP system.
- Bachelor's degree or higher in Supply Chain & Logistics Management, Business Analytics, Economics, Engineering, or a related field.
- At least 3-5 years of experience as a Demand & Supply Analyst or Supply Chain Data Analyst.
- Data analysis skills (Excel, Power BI, Python).
- Understanding of ERP systems (e.g., SAP, Oracle).
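The demand forecasting this role describes can start from something as simple as a moving-average baseline before SAP or analytics tools refine it. The sales figures below are invented:

```python
# Naive demand forecast: the average of the last `window` periods' sales
def moving_average_forecast(sales, window=3):
    recent = sales[-window:]
    return sum(recent) / len(recent)

monthly_sales = [120, 130, 125, 140, 135]  # hypothetical units sold
forecast = moving_average_forecast(monthly_sales, window=3)
```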
Skills:
Industrial Engineering, Finance, Project Management
Job type:
Full-time
Salary:
Negotiable
- Insight Generation & Strategic Recommendations: Utilize deep knowledge of market trends, financial data, and process standards to generate actionable insights. Develop analysis models and provide strategic recommendations that drive business decisions and improvements in operations efficiency.
- Business Analysis & Reporting: Collaborate with senior management to analyze internal and external data sources. Develop comprehensive reports and presentations that support key initiatives and guide the business in trial projects and rollouts.
- Model Development & Data Integrity: Develop and maintain robust analytical models that support business analysis. Ensure data integrity and create best-practice reports based on thorough data analysis and visualization.
- Stakeholder Collaboration & Facilitation: Lead and facilitate collaboration across the business, ensuring alignment and engagement among stakeholders. Work closely with process and budget owners to achieve business objectives and implement strategic recommendations.
- Project Tracking & Risk Mitigation: Regularly monitor and update project progress, ensuring milestones are met. Develop and execute mitigation plans for any critical issues, ensuring the successful delivery of projects.
- Consultation & End-User Support: Provide initial consultation and tailored solutions to stakeholders based on business analysis. Occasionally develop dashboards and train end users to ensure effective use of insights and tools.
- 2-5+ years of working experience in Data Analytics and Process Improvement.
- Master's Degree in Industrial Engineering, Supply Chain, Finance, IT or related field.
- Knowledge of basic statistical techniques for hypothesis testing or prediction (Correlation, Regression, etc.).
- Skilled in process improvement, project management, and data analysis.
- Ability to use data and metrics to test theories, back up assumptions, develop business cases, complete root cause analysis and measure success.
- Ability to work independently and under pressure with business-partnering mindset.
- Good presentation, communication and influencing skills.
- Able to work at two office sites (Head Office - Phattanakarn, and Central Office - Nawamin).
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
SQL, Tableau, Power BI
Job type:
Full-time
Salary:
Negotiable
- Data Cleaning and Preparation - Retrieve data from one or more sources and prepare it for numerical and categorical analysis, including handling missing and inconsistent data that may affect the analysis.
- Data Analysis and Exploration - Take a business question or need and turn it into a data question. Then, transform and analyze data to extract an answer to that question. Moreover, find interesting trends or relationships in the data that could bring value to a business.
- Creating Data Visualizations and Communication - Produce reports or build dashboards on your findings and communicate them to business stakeholders and management.
- Statistical Knowledge.
- Mathematical Ability.
- Programming languages, such as SQL.
- Analytic tools such as Tableau, Power BI.
- TeraData, Big data Hadoop Tech, Cloud Tech.
- Bachelor's degree in MIS, Business, Economics, Computer Science, or a related field.
- At least 2-3 years of experience in Data Analysis.
- Experience in designing and architecting BI / Data Analytics solutions is preferred.
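As one minimal illustration of the missing-data handling described in this listing, the sketch below fills missing numeric values with the column mean. The record structure and values are hypothetical:

```python
# Fill missing numeric values with the column mean: a common cleaning step
def fill_missing_with_mean(records, column):
    present = [r[column] for r in records if r[column] is not None]
    mean = sum(present) / len(present)
    return [
        {**r, column: mean if r[column] is None else r[column]}
        for r in records
    ]

data = [{"age": 30}, {"age": None}, {"age": 50}]
cleaned = fill_missing_with_mean(data, "age")
```

Mean imputation is only one option; dropping rows or using a model-based fill may suit other analyses better.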