Skills:
Sales, Hadoop, ETL, English
Job type:
Full-time
Salary:
negotiable
- Bachelor's degree or equivalent practical experience.
- 10 years of experience in software sales or account management.
- Experience promoting analytics, data warehousing, or data management software.
- Ability to communicate fluently in English and Thai to support APAC customers.
- Experience with business intelligence front-end, data analytics middleware, or back-end data warehouse technologies.
- Experience working with sales engineers and customer technical leads to build business cases for transformation and accompanying plans for implementation.
- Understanding of data analytics technology stack (e.g., Hadoop/Spark, Columnar data warehouses, data streaming, ETL and data governance, predictive analytics, data science framework, etc.).
- Understanding of Google Cloud Data and Analytics offerings (e.g., BigQuery, Looker, Dataproc, Pub/Sub, etc.).
- Ability to engage and influence executive stakeholders as a business advisor and thought leader in data and analytics.
- Excellent business acumen and problem-solving skills.
- As a member of the Google Cloud team, you inspire leading companies, schools, and government agencies to work smarter with Google tools like Google Workspace, Search, and Chrome. You advocate for the innovative power of our products to make organizations more productive, collaborative, and mobile. Your guiding light is doing what's right for the customer: you meet customers exactly where they are and provide them the best solutions for innovation. Using your passion for Google products, you help spread the magic of Google to organizations around the world.
- In this role, you will build an understanding of our customers' businesses and bring expertise to executive-level relationships to help them deliver their strategies. You will leverage expertise in promoting data analytics and work with account teams, customer engineering, and partners to ensure customer outcomes.
- Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
- Calibrate the business against objectives and results; forecast and report the state of the business for the assigned territory.
- Build and maintain executive relationships with customers as the data analytics subject matter expert, influencing direction.
- Develop and execute account plans, including a broader enterprise plan across industries. Focus on building accounts.
- Assist customers in identifying use cases suitable for Google Cloud Data and Analytics solutions, articulating solution differentiation and business impacts.
- Work with Google account and technical teams to develop and drive pipelines, and provide expertise. Develop Go-To-Market (GTM) efforts with Google Cloud Platform partners.
- Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Skills:
Big Data, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and design databases based on that understanding and those requirements, with consideration for performance.
- Maintain the data dictionary, data relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
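In practice, the pipeline responsibilities above (ingest raw data, transform it, load it into a warehouse table) reduce to an extract-transform-load loop. A minimal sketch in plain Python, using SQLite as a stand-in warehouse; the field names and schema are invented for illustration:

```python
import json
import sqlite3

def extract(raw_lines):
    """Parse raw JSON lines, skipping records that fail to parse."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # bad record: skip (or route to a dead-letter store)

def transform(record):
    """Normalize field names and types for downstream use."""
    return (record["id"],
            record.get("name", "").strip().lower(),
            float(record.get("amount", 0)))

def load(conn, rows):
    """Idempotent load: re-inserting the same id replaces the row."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales "
                 "(id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

raw = ['{"id": 1, "name": "  Alice ", "amount": "9.5"}',
       'not json',
       '{"id": 2, "name": "Bob"}']
conn = sqlite3.connect(":memory:")
load(conn, (transform(r) for r in extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
```

Real pipelines would swap SQLite for the actual warehouse and add monitoring, but the extract/transform/load separation is the same.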
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert-level experience with query languages: SQL, Databricks SQL, PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
ETL, Power BI, SQL
Job type:
Full-time
Salary:
negotiable
- Responsible for designing data models that support long-term use.
- Help plan the creation of data marts suited to various use cases.
- Work with the ETL / Data Engineer team to align schemas and pipelines with requirements.
- Communicate well and translate business requirements into data solutions.
- Perform in-depth data analysis for business units.
- Design and develop advanced dashboards/reports in Power BI.
- Extract data from the DWH using complex SQL, including data wrangling.
- 5 years of experience in a Data role (BA or SA).
- Strong proficiency in SQL.
- Proficiency with visualization tools such as Power BI and/or other data tools.
- Solid understanding of Data Warehouse concepts and ETL.
- Skills in designing data models and data marts.
- Analytical and problem-solving skills.
- Contact Information:
- K. Sawarin Tel.
- Office of Human Capital.
- DIGITAL AND TECHNOLOGY SERVICES CO., LTD.
- F.Y.I Center 2525 Rama IV Rd, Khlong Tan, Khlong Toei, Bangkok 10110.
- MRT QSNCC Station Exit 1.
Skills:
Apache, Compliance, Automation
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain robust and scalable data pipelines using tools such as Apache Airflow, PySpark, and cloud-native services (e.g., Azure Data Factory, Microsoft Fabric Pipelines).
- Manage data ingestion from APIs, files, and databases into data lakes or data warehouses (e.g., Microsoft Fabric Lakehouse, Iceberg, DWS).
- Ensure seamless data integration across on-premise, cloud, and hybrid environments.
- Implement data validation, standardization, and transformation to ensure high data quality.
- Apply data encryption, masking, and compliance controls to maintain security and privacy standards.
- AI & Intelligent Automation: Collaborate with Data Scientists to deploy ML models and integrate predictive insights into production pipelines (e.g., using Azure Machine Learning or Fabric Notebooks).
- Support AI-powered automation and data insight generation through tools like Microsoft Co-pilot Studio or LLM-powered interfaces (chat-to-data).
- Assist in building lightweight AI chatbots or agents that leverage existing datasets to enhance business efficiency.
- Qualifications & Skills: 3-5+ years of experience in Data Engineering or AI Engineering roles.
- Proficiency in Python, SQL, and big data frameworks (Apache Airflow, Spark, PySpark).
- Experience with cloud platforms: Azure, Huawei Cloud, or AWS.
- Familiarity with Microsoft Fabric services: OneLake, Lakehouse, Notebooks, Pipelines, and Real-Time Analytics.
- Hands-on experience with Microsoft Co-pilot Studio to design chatbots, agents, or LLM-based solutions.
- Experience in ML model deployment using Azure ML, ModelArts, or similar platforms.
- Understanding of vector databases (e.g., Qdrant), LLM orchestration (e.g., LangChain), and prompt engineering is a plus.
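The data validation and standardization duties mentioned above can be illustrated platform-neutrally; the required fields and rules below are invented for the example, not taken from any Fabric or Azure API:

```python
from datetime import datetime, timezone

REQUIRED = {"event_id", "ts", "value"}  # hypothetical schema for this sketch

def validate(record):
    """Return a list of data-quality violations for one record."""
    errors = []
    missing = REQUIRED - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "value" in record and not isinstance(record["value"], (int, float)):
        errors.append("value is not numeric")
    return errors

def standardize(record):
    """Standardize timestamps to UTC ISO-8601 and round numeric values."""
    out = dict(record)
    out["ts"] = datetime.fromtimestamp(record["ts"], tz=timezone.utc).isoformat()
    out["value"] = round(float(record["value"]), 2)
    return out

good = {"event_id": "e1", "ts": 0, "value": 3.14159}
bad = {"event_id": "e2", "value": "oops"}
```

In a real pipeline the same checks would run inside the orchestrator, with failing records quarantined rather than silently dropped.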
Experience:
4 years required
Skills:
Electrical Engineering, Mechanical Engineering, Excel, English
Job type:
Full-time
Salary:
negotiable
- Supervise contractors who perform servicing or preventive maintenance.
- Perform limited maintenance tasks to include: filter changes, battery system PMs, and Rack PDU & Rack ATS replacements.
- Perform root cause analysis for operational issues.
- Troubleshoot facility and rack level events.
- Ensure all personnel on-site follow safety protocols.
- Work on-call and a rotating schedule as needed.
- Take daily operational readings and provide metrics reporting to senior engineers.
- Perform basic support concepts such as ticketing systems, root cause analysis, and task prioritization.
- Diverse Experiences
- AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
- Why AWS?
- Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
- Inclusive Team Culture
- AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.
- Mentorship & Career Growth
- We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
- Work/Life Balance
- We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
- BASIC QUALIFICATIONS.
- Associate's Degree or Technical Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- Fluent in English, both written and spoken.
- 2+ years working in a Data Center or Mission Critical Environment.
- PREFERRED QUALIFICATIONS.
- Bachelor's Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- 4+ years of Data Center Operation Experience.
- Fundamental knowledge of network design and layout as well as low voltage (copper/ fiber) cabling.
- 2+ years with Microsoft Excel, Word, and Outlook.
- Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Job type:
Full-time
Salary:
negotiable
- Design and implement methods for storing and retrieving data, and monitor data pipelines: ingest raw data sources, then transform, clean, store, and enrich the data so it is ready for use, covering both structured and unstructured data, and work with data lakes and cloud big data technology.
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory.
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI and Tableau, working across workstreams to support data requirements, and collaborate with data scientists, data analysts, the data governance team, and business stakeholders on several projects.
- Analyze and perform data profiling to understand data patterns, following Data Quality and Data Management processes.
- 3+ years of experience in big data technology, data engineering, or data analytics application development.
- Experience with unstructured data for business intelligence would be an advantage.
- Technical skills in SQL, UNIX and shell scripting, Python, R, Spark, and Hadoop.
Skills:
Research, ETL, Automation
Job type:
Full-time
Salary:
negotiable
- Lead the design and development of data architecture, ensuring scalability, security, and alignment with business strategy.
- Oversee the collection, transformation, and integration of data from multiple internal and external sources.
- Conduct advanced research and troubleshooting to address complex business and technical problems.
- Design, build, and optimize data pipelines and ETL processes to handle large-scale and real-time data.
- Implement automation solutions to minimize manual intervention and improve data efficiency.
- Provide technical leadership and mentorship to junior engineers, ensuring best practices in coding, testing, and deployment.
- Collaborate with cross-functional stakeholders including Data Scientists, Analysts, and Business Leaders to deliver actionable insights.
- Evaluate and recommend new tools, frameworks, and technologies to enhance data engineering capabilities.
- Job Specification: Bachelor's Degree in Information Technology, Computer Science, Statistics, Mathematics, Business, or a related field (Master's Degree is a plus).
- Minimum of 5 years' experience in data engineering, with at least 2 years in a senior or lead role.
- Proven expertise in the data analytics lifecycle, including business problem framing, KPI/metrics design, exploratory analysis, and presenting data insights.
- Strong hands-on experience with cloud platforms (AWS, GCP, Azure) and advanced programming skills in Python, Java, PySpark.
- Solid knowledge of data processing, ETL frameworks, data warehousing, and messaging queue systems (e.g., Kafka).
- Demonstrated experience in designing highly scalable, resilient, and secure data systems.
Job type:
Full-time
Salary:
negotiable
- We are looking for a skilled Data Engineer to join our team and help build and maintain our data infrastructure. The ideal candidate will be responsible for designing, implementing, and managing our data processing systems and pipelines. You will work closely with data scientists, analysts, and other teams to ensure efficient and reliable data flow throughout the organization.
- Design, develop, and maintain scalable data pipelines for batch and real-time processing.
- Implement ETL processes to extract data from various sources and load it into data warehouses or data lakes.
- Optimize data storage and retrieval processes for improved performance.
- Collaborate with data scientists and analysts to understand their data requirements and provide appropriate solutions.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security measures and access controls.
- Troubleshoot data-related issues and optimize system performance.
- Stay up-to-date with emerging technologies and industry trends in data engineering.
- Document data architectures, pipelines, and processes.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-4 years of experience in data engineering or similar roles.
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and experience with relational databases (e.g., Databricks, PostgreSQL, MySQL).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data warehousing concepts and ETL best practices.
- Experience with version control systems (e.g., Git).
- Understanding of data cleansing, data modeling, and database design principles.
- Solid problem-solving skills and attention to detail.
- Good communication skills and ability to work with technical and non-technical team members.
- Experience with the Azure data platform (ADF, Databricks).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of stream processing technologies and data sources (e.g., Kafka, APIs, Google BigQuery, MongoDB, SFTP sources).
- Experience with containerization technologies (e.g., Docker).
- Experience dealing with large data volumes and optimization skills in development.
- Understanding of machine learning concepts and data science workflows.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
negotiable
- The Lead System Analyst/Senior Data Engineer is assigned to the IECC Project and Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and rerunnability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks.
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
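Several of the bullets above (retries, backfills, idempotency, rerunnability) come down to one pattern: never redo completed work, and retry transient failures a bounded number of times. A toy sketch with the scheduler and persistent state stubbed out as a Python set:

```python
import time

def run_idempotent(task_id, task_fn, completed, retries=3, backoff=0.0):
    """Run a task at most once per task_id, retrying transient failures.

    `completed` is a persistent set of finished task ids; in practice this
    would be a database table or the scheduler's state, not an in-memory set.
    """
    if task_id in completed:        # rerun/backfill: skip work already done
        return "skipped"
    for attempt in range(1, retries + 1):
        try:
            task_fn()
            completed.add(task_id)  # mark done only after success
            return "ok"
        except Exception:
            if attempt == retries:
                raise               # exhausted retries: surface the failure
            time.sleep(backoff * attempt)

calls = {"n": 0}
def flaky():
    """Simulated task that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")

done = set()
first = run_idempotent("load_2024-01-01", flaky, done)   # retries, then ok
second = run_idempotent("load_2024-01-01", flaky, done)  # already done
```

Airflow, Control-M, and ESP provide the retries, state tracking, and backfill bookkeeping for you; the point of the sketch is only what those features guarantee.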
- Apply now if you have these advantages.
- Bachelor's/Master's degree in Computer Engineering, Computer Science, Information Technology, or related fields.
- At least 8-12 years as a System Analyst/Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or any relevant programming languages) and be able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark); comfortable with the Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
- Why join Krungsri?
- As a part of MUFG (Mitsubishi UFJ Financial Group), we are truly a global bank with networks all over the world.
- We offer a striking work-life balance culture with hybrid work policies (2 days minimum in office per week).
- Unbelievable benefits such as attractive bonuses, employee loans with special rates, and many more.
- Apply now before this role is closed.
- FB: Krungsri Career (http://bit.ly/FacebookKrungsriCareer).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer).
- Talent Acquisition Department
- Bank of Ayudhya Public Company Limited
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Remark: The bank needs to and will have a process for verifying personal information related to the criminal history of applicants before they are considered for employment with the bank.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the links below.
- EN (https://krungsri.com/b/privacynoticeen).
- TH (https://krungsri.com/b/privacynoticeth).
Skills:
Compliance, AutoCAD
Job type:
Full-time
Salary:
negotiable
- Assist in the design and development of mechanical systems including HVAC, Fire protection, and Hydraulic systems tailored for data center needs.
- Help plan and manage maintenance activities for mechanical systems, ensuring adherence to industry standards and operational efficiency.
- Maintain accurate records of mechanical system designs, maintenance activities, and compliance with safety regulations.
- Participate in site inspections to assess mechanical systems' condition and ensure compliance with design specifications.
- Assist in coordinating with third-party vendors for maintenance and upgrades, ensuring that all work meets established standards.
- Be available to respond to on-site incidents and assist senior engineers in troubleshooting mechanical failures.
- Engage in ongoing training and professional development opportunities to stay updated on the latest technologies in the data center industry.
- Job Qualifications: Bachelor's degree in Mechanical Engineering or a related field is required.
- More than 5 years of experience in mechanical engineering, preferably within a data center or critical environment.
- Basic understanding of HVAC systems, mechanical design principles, and relevant software tools (e.g., AutoCAD).
- Strong problem-solving abilities to identify issues and propose effective solutions.
- Good verbal and written communication skills for effective collaboration with team members and vendors.
- Ability to work well within a team environment while also being capable of taking initiative when necessary.
- Fluent in English both written and verbal (Minimum 750 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Skills:
Tableau, Power BI, Excel
Job type:
Full-time
Salary:
negotiable
- Leads and improves analytical programs and presents the results.
- Analyzes the variable granular data of the business and regularly uses advanced models to deliver business insights across diverse business domains.
- Answers and anticipates critical business questions and opportunities and delivers insights to the business in ways that make significant impact.
- Demonstrates use of data visualization (Tableau, Power BI, Excel), and analytic tools (Python, R, SQL, KNIME/Alteryx) to grasp the business insights from mountains of data.
- Collaborates with multifunctional teams (Operations, Initiatives, and IT etc.) to improve key business priorities based on data-driven insights.
- Lead the roll-out and development of digital solution per business needs.
Job type:
Full-time
Salary:
negotiable
- The Senior Data Engineer plays a key role in designing, developing, and managing cloud-based data platforms and in creating data structures for high-level data analysis, and works with business and technical teams to ensure that data management is appropriate and supports organizational goals.
- Responsible for the design, construction, and maintenance of optimal and scalable data pipeline architectures on cloud platforms (e.g., GCP, AWS, Azure).
- Oversee the development and management of complex ETL/ELT processes for data ingestion.
- Author and optimize advanced, high-performance SQL queries for complex data transformation, aggregation, and analysis.
- Leverage the Python programming language for automation, scripting, and the development of data processing frameworks.
- Administer and optimize cloud-based data warehouse solutions and associated data lakes.
- Collaborate professionally with data scientists, analysts, and key business stakeholders to ascertain data requirements and deliver effective technical solutions.
- Provide mentorship to junior engineers and champion the adoption of data engineering best practices throughout the organization.
- Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related field.
- At least 5 years of experience working in a data engineering or related position.
- Proficient in advanced SQL, including query optimization and performance tuning.
- Experienced in managing and designing architecture on at least one major cloud platform (Google Cloud Platform, AWS, or Azure).
- Skilled in using Python for data processing and advanced pipeline development.
- Experienced with tools and technologies for data ingestion, connectivity, and management.
- Deep understanding of data modeling principles, data warehousing methodologies, and modern data architecture.
- Excellent analytical and problem-solving skills.
- Communication and teamwork skills.
- Ability to plan and manage tasks effectively.
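As a flavor of the "advanced, high-performance SQL" this role calls for, a transformation-plus-aggregation query with a window function might look like the following; the schema and data are invented, and SQLite stands in for the cloud warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, month TEXT, revenue REAL);
INSERT INTO orders VALUES
  ('TH', '2024-01', 100), ('TH', '2024-02', 150),
  ('SG', '2024-01', 200), ('SG', '2024-02', 180);
""")

# Monthly revenue per region plus a running total, via a window function
# (supported by SQLite 3.25+ and every major cloud warehouse).
rows = conn.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM orders
    ORDER BY region, month
""").fetchall()
```

On a real warehouse, tuning such queries means checking partition pruning and the window's sort cost; the SQL itself carries over unchanged.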
Skills:
DevOps
Job type:
Full-time
Salary:
negotiable
- FB: Krungsri Career(http://bit.ly/FacebookKrungsriCareer).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer).
- LINKEDIN: Krungsri (http://bit.ly/LinkedinKrungsri).
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the link below.
- EN.
- (https://krungsri.com/b/privacynoticeen).
- The bank needs to and will have a process for verifying personal information related to the criminal history of applicants before they are considered for employment with the bank.
Job type:
Full-time
Salary:
negotiable
- Cloud Infrastructure.
- Design, deploy, and manage cloud infrastructure (AWS, Azure).
- Administer Azure Entra ID (Azure AD), hybrid identity solutions, MFA, and conditional access policies.
- Manage AWS services such as EC2, S3, VPC, CloudFront, WAF, API Gateway (e.g., KONG), Docker, and Kubernetes.
- Migrate workloads and applications to the cloud with minimal downtime.
- Ensure seamless integration between on-premise systems and cloud platforms.
- Provide cost optimization recommendations for infrastructure.
- Monitor system availability, performance, and security using AWS monitoring tools.
- Design network diagrams and plan infrastructure growth.
- Identity & Security Management.
- Configure and manage Active Directory Federation Services (ADFS) for SSO.
- Handle SSL/TLS certificate lifecycle: issuance, deployment, renewal, and revocation.
- Manage security solutions including WAF, Reverse Proxy, and Application Gateway.
- Education: Bachelor's degree or higher in Computer Science, Information Technology, or related fields.
- Experience: 3-5 years in on-premise and cloud infrastructure, including 2-3 years with AWS cloud services.
- Proficiency in VMware, Veeam Backup, ADDS/ADFS, Azure AD, and certificate management.
- Proven ability to design, evaluate, perform POC, and implement cloud solutions.
- Strong troubleshooting skills for both cloud and on-premise systems.
- Willingness to work nights/weekends as required.
- Self-motivated, with a commitment to continuous learning.
- Strong communication, presentation, problem-solving, and negotiation skills.
Experience:
1 year required
Skills:
SQL, Python, Automation, English
Job type:
Full-time
Salary:
negotiable
- Developing data pipelines and gathering data into internal tables using SQL, in-house tools, and Python.
- Collaborate with stakeholders to prepare, process, and validate data for business needs.
- Make recommendations on improvement, maintenance, or other factors to improve the database system.
- Develop reports, dashboards, and automation solutions using SPARK, SQL, Python, Excel, and in-house tools.
- Ensure data integrity by sanitizing, validating, and aligning numbers with accurate logic.
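The "sanitize, validate, and align numbers" duty above is essentially numeric reconciliation. A minimal sketch, assuming messy spreadsheet-style inputs (the input formats and tolerance are illustrative, not from the posting):

```python
def sanitize_number(raw):
    """Coerce messy values ('1,234', ' 5 ', '', None) to floats."""
    if raw is None:
        return 0.0
    cleaned = str(raw).replace(",", "").strip()
    return float(cleaned) if cleaned else 0.0

def reconcile(detail_rows, reported_total, tolerance=0.01):
    """Check that detail rows sum to the reported total within a tolerance."""
    total = sum(sanitize_number(v) for v in detail_rows)
    return abs(total - reported_total) <= tolerance, total

# Detail rows from a raw extract vs. the total a dashboard reports.
ok, total = reconcile(["1,200", " 300 ", "", None, "0.5"], 1500.5)
```

The same check scales up as a SQL aggregate compared against a control total; the logic stays identical.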
- Requirements: Master's or bachelor's degree in a quantitative or other relevant field.
- 1-3 years of experience in Data Analytics, Data Engineering, or Business Intelligence.
- Experience in project & stakeholder management responsibilities.
- Strong SQL skills for data querying and Excel proficiency (Python is a plus).
- Strong English communication skills, both verbal and written.
- Detail-oriented and enjoy building from the ground up.
- Fresh Graduates are welcome.
Job type:
Full-time
Salary:
negotiable
- Installing, configuring, and maintaining hardware, networks, and operating systems of the IT estate, across both cloud and on-premises infrastructure.
- Installing, configuring, and maintaining software applications and platforms, across both cloud and on-premises infrastructure.
- Managing users, groups and operating system policies across the entire IT network.
- Diagnosing, troubleshooting and resolving application, software, hardware and networking issues.
- Monitoring system and application performance, availability, and security.
- Automate tasks like code deployment, testing, and infrastructure provisioning through scripts, tools, and CI/CD pipelines.
- Replacing and upgrading outdated or defective components.
- Enforcing security best practices to prevent cyber attacks and security breaches.
- 7-8 years of experience in system, application or production support.
- Strong experience with UNIX, Linux, and Windows operating systems (System Administrator role).
- Knowledge of networking fundamentals (TCP/IP, routing, network protocols, configurations, and security practices).
- Knowledge of cloud platforms (Azure, OpenShift, Kubernetes) and DevOps/DevSecOps practices.
- Good knowledge of RESTful APIs, HTTP protocol, OAuth, and JSON.
- Experience supporting Java-based APIs is an advantage.
- Familiarity with database technologies (Oracle, MS SQL Server, PostgreSQL, MongoDB, MySQL).
- Experience with monitoring and logging tools (e.g., Dynatrace, Kibana, Elasticsearch, Grafana).
- Knowledge of scripting and automation tools (shell scripts, PowerShell, Ansible playbooks, CI/CD pipelines).
- Knowledge of distributed event streaming platforms is a plus.
- Proven ability to perform root cause analysis and problem diagnosis in collaboration with development teams.
Skills:
Digital Marketing, Big Data, Statistics
Job type:
Full-time
Salary:
negotiable
- Drive clear and effective business translation of AI/ML products between business and technical stakeholders.
- Design, develop and leverage Advanced analytics, Artificial Intelligence (AI) and Machine Learning (ML) models to support digital marketing, MarTech, AdTech, and hyper-personalization initiatives.
- Analyze Big Data to develop effective predictive and recommendation models.
- Collaborate closely with Product Owners, IT teams, and Data teams to implement AI solutions that improve marketing campaign performance.
- Continuously refine and enhance AI models through testing and performance evaluation.
- Participate in vendor selection processes to identify and secure the best external partners for data science and AI/ML projects.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Data Science, Statistics, or any related field.
- Minimum of 2 years of experience as an AI/ML engineer, in cloud solutions, or in a related field.
- Proficiency in some of the following: Python, PySpark, SQL, etc.
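The recommendation-model duty above can be illustrated with a minimal item-similarity recommender. This sketch uses only the standard library; the campaign-interaction data and item names are invented for illustration and do not come from the posting.

```python
from math import sqrt

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse user->engagement vectors."""
    common = set(a) & set(b)
    dot = sum(a[u] * b[u] for u in common)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(target: str, interactions: dict, top_n: int = 2) -> list:
    """Rank other items by similarity of their user-interaction vectors."""
    scores = {
        item: cosine(interactions[target], vec)
        for item, vec in interactions.items()
        if item != target
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical data: campaign -> {user: engagement score}.
interactions = {
    "promo_a": {"u1": 5, "u2": 3, "u3": 4},
    "promo_b": {"u1": 4, "u2": 3},
    "promo_c": {"u3": 5, "u4": 2},
}
print(recommend("promo_a", interactions))
```

Production recommendation systems would of course use PySpark or a dedicated ML framework over real Big Data, but the similarity-ranking idea is the same.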
- Experience or strong interest in digital marketing, MarTech, and AdTech, especially data-driven marketing strategies, is a plus.
- Experience building tools/models to support retention, up-selling/cross-selling, optimization, mobile app data, and digital marketing is a plus.
- Ability to communicate and collaborate with cross-functional teams.
- A growth mindset and openness to continuously learning and taking on new projects and technologies.
- You have read and reviewed Infinitas By Krungthai Company Limited's Privacy Policy at https://krungthai.com/Download/download/DownloadDownload_73Privacy_Policy_Infinitas.pdf. The Bank does not intend or require the processing of any sensitive personal data, including information related to religion and/or blood type, which may appear on copy of your identification card. Therefore, please refrain from uploading any documents, including copy(ies) of your identification card, or providing sensitive personal data or any other information that is unrelated or unnecessary for the purpose of applying for a position on the website. Additionally, please ensure that you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website.
- The Bank is required to collect your criminal record information to assess employment eligibility, verify qualifications, or evaluate suitability for certain positions. Your consent to the collection, use, or disclosure of your criminal record information is necessary for entering into an agreement and being considered for the aforementioned purposes. If you do not consent to the collection, use, or disclosure of your criminal record information, or if you later withdraw such consent, the Bank may be unable to proceed with the stated purposes, potentially resulting in the loss of your employment opportunity.
Skills:
ETL, Automation, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Design & Implement Data Platforms: Design, develop, and maintain robust, scalable data pipelines and ETL processes, with a focus on automation and operational excellence.
- Ensure Data Quality and Governance: Implement automated data validation, quality checks, and monitoring systems to ensure data accuracy, consistency, and reliability.
- Manage CI/CD for Data: Own and optimize the CI/CD pipelines for data engineering workflows, including automated testing and deployment of data transformations and schema changes.
- Architect & Implement IaC: Use Infrastructure as Code (IaC) with Terraform to manage data infrastructure across various cloud platforms (Azure, AWS, GCP).
- Performance & Optimization: Proactively monitor and optimize query performance, data storage, and resource utilization to manage costs and enhance efficiency.
- Collaborate with Stakeholders: Manage communication with technical and business teams to understand requirements, assess technical and business impact, and deliver effective data solutions.
- Strategic Design: Possess the ability to see the big picture in architectural design, conduct thorough risk assessments, and plan for future scalability and growth.
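The automated data-validation duty above can be sketched as a small rule engine. This is an illustrative stand-in for tools like Great Expectations or Soda Core; the column names, rules, and sample rows are assumptions invented for the example.

```python
def validate_rows(rows, rules):
    """Apply per-column quality rules; return a list of failures.

    `rules` maps a column name to a predicate that must hold for
    the value; each failure is (row index, column, offending value).
    """
    failures = []
    for i, row in enumerate(rows):
        for col, ok in rules.items():
            value = row.get(col)
            if not ok(value):
                failures.append((i, col, value))
    return failures

# Illustrative rules: non-null id, amount within a sane range.
rules = {
    "id": lambda v: v is not None,
    "amount": lambda v: v is not None and 0 <= v <= 1_000_000,
}
rows = [
    {"id": 1, "amount": 250},
    {"id": None, "amount": 99},
    {"id": 3, "amount": -5},
]
print(validate_rows(rows, rules))
```

In a real pipeline the failure list would feed a monitoring dashboard or fail the CI/CD stage rather than being printed.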
- Experience: 1-3 years of experience in data engineering, data warehousing, and ETL processes, with a significant portion of that time focused on DataOps or a similar operational role.
- Platform Expertise: Strong experience with data platforms such as Databricks and exposure to multiple cloud environments (Azure, AWS, or GCP).
- Data Processing: Extensive experience with Apache Spark for large-scale data processing.
- Orchestration: Experience working with data orchestration tools like Azure Data Factory (ADF), Apache Airflow, or similar.
- CI/CD & Version Control: Knowledge of version control (Git) and experience with CI/CD pipelines (GitLab CI/CD, GitHub Actions).
- IaC: Hands-on experience with Terraform.
- Programming: Programming skills in Python and advanced proficiency in SQL.
- Soft Skills: Strong stakeholder management, communication, and collaboration skills. The ability to articulate complex technical concepts to non-technical audiences is a must.
- Problem-Solving: Strong problem-solving skills with an ability to analyze technical challenges and their business impact.
- Data Modeling: Experience with data modeling tools and methodologies, specifically with dbt (data build tool).
- AI & ML: Experience with AI-related technologies like Retrieval-Augmented Generation (RAG) and frameworks such as LangChain.
- Data Observability: Hands-on experience with data quality and observability tools such as Great Expectations, Monte Carlo, or Soda Core.
- Data Governance: Familiarity with data governance principles, compliance requirements, and data catalogs (e.g., Unity Catalog).
- Streaming Technologies: Experience with stream processing technologies like Kafka or Flink.
- Containerization: Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Open Source: Contributions to open-source projects or relevant certifications.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Power BI, Tableau, SQL
Job type:
Full-time
Salary:
negotiable
- Data Lifecycle Management: A Data PM oversees the entire lifecycle of data projects, from data acquisition, integration, and storage to analysis and visualization. This demands a solid understanding of technical processes and data systems.
- Collaboration with Technical Teams: Work closely with engineers, data scientists, and IT teams to ensure data pipelines and infrastructures align with project goals. This requires a deep understanding of technical jargon, workflows, and dependencies.
- Monitoring and Reporting: Track project progress and provide regular updates and reports to stakeholders.
- Mastery of Technical Tools and Platforms.
- BI Tools: Power BI, Tableau.
- Project & Code Collaboration: GitHub, JIRA, Confluence.
- Cloud Systems: AWS, Azure, Google Cloud for data solutions.
- The role often requires working knowledge of SQL, Python, or other languages to interpret project outcomes, test processes, and validate results.
- Technical Decision-Making Authority.
- System Architecture: A Data PM may decide how data systems should be architected or what infrastructure to adopt based on project requirements.
- Tool Selection: Selecting appropriate analytics tools, databases, or platforms for project success is a regular part of the role.
- Ensuring Data Integrity: Data governance, accuracy, and validation are all technical concerns within a Data PM's purview.
- Challenges of the Data PM Role.
- Translate business needs into data requirements.
- Collaborate meaningfully with technical teams on implementation.
- Ensure compliance with technical standards (e.g., data security, privacy).
- Bachelor's or master's degree in a relevant field, such as Data Science, Computer Science, Business Administration, or Supply Chain Management.
- 3-5 years of experience in Project Management or a similar role, preferably in Data or IT domains.
- Strong understanding of Retail, Wholesale, or Supply Chain processes.
- Proficient in project management tools like Jira, Trello, or Microsoft Project.
- Experience with Agile/Scrum methodologies.
- Familiarity with:
- Data Analytics: SQL, Python, or other languages to interpret project outcomes.
- Data Engineering: Data pipelines, ETL processes, data storage systems.
- Data Science: Algorithms, machine learning models, statistical analysis, A/B testing.
- Technical Roadblocks: Anticipating and resolving issues like system integration, latency, and scalability.
- Excellent communication and stakeholder management skills.
- Proficient in English communication.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Skills:
Automation, TypeScript, Javascript
Job type:
Full-time
Salary:
negotiable
- Architectural Excellence -Engage in deep architecture reviews and technology assessments to drive continuous improvement and innovation.
- Stakeholder Collaboration - Collaborate with multiple stakeholders to understand their requirements and support them on their cloud adoption and migration journey.
- Technical Excellence- Translate developer needs into technical requirements and optimize cloud architectures for high availability, scalability, and performance.
- Development Leadership- Lead a dedicated development team in designing, developing, and delivering cutting-edge business automation capabilities for our Cloud Centre of Excellence.
- Cloud Infrastructure Mastery- Design and implement scalable, secure, and cost-effective cloud infrastructure solutions using major cloud providers such as Azure and Huawei, tailored to project requirements.
- Technical Guidance- Provide valuable technical guidance and support to foster a collaborative and innovative work environment.
- Stakeholder Engagement- Interface with stakeholders, including developers, project managers, and business owners, to gather feedback and refine our self-service platform.
- If you meet the qualifications below and are ready to take on a challenging role, we encourage you to apply.
- Proven Expertise: Minimum of 5 years of experience in consulting and client-side roles, or a combination of both.
- Hands-On Leadership: Demonstrate at least 5 years of hands-on development experience before transitioning to an engineering management or architecture role.
- Language Proficiency: Possess current or previous experience in one or more programming languages such as TypeScript/JavaScript, Java, Python, or Kotlin, with a deep understanding of their pros and cons.
- Tech Ecosystem Acumen: Navigate the modern technology ecosystem with ease, encompassing cloud providers, commerce vendors, experience platforms, event brokers, data processing platforms, analytics, business intelligence, big data, and AI.
- Cloud & DevOps: Exhibit hands-on experience with cloud architecture (preferably Azure), DevOps, Site Reliability Engineering, and Quality Engineering best practices, plus experience with different cloud pricing models and strategies.
- Lifecycle Expertise: Embrace a broad understanding of the entire project lifecycle, from project inception to MVP scoping, agile development, and operational excellence.
- Agile Knowledge: Bring working knowledge of agile delivery and scaled agile methodologies, ideally on the 'architecture runway' side.
- Continuous Architecture: Contribute your experience with 'real-life' continuous architecture practices, preferably within an architecture guild/team in a large organization.
