

Experience:
3+ years
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
Negotiable
- Analyze and organize raw data.
- Combine raw information from different sources.
- Design and build data models to support business requirements.
- Develop and maintain data ingestion and processing systems.
- Implement data storage solutions (databases and data lakes).
- Ensure data consistency and accuracy through data validation and cleansing techniques (a minimal validation sketch follows this list).
- Conduct complex data analysis and report on results.
- Explore ways to enhance data quality and reliability.
- Work with cross-functional teams to identify and address data-related issues.
- Write unit/integration tests, contribute to the engineering wiki, and document work.
- Bachelor's or Master's degree in Computer Science, Software Engineering, Computer Engineering, ICT, IT, or any related technical field.
- 2-5 years of experience as a data engineer or in a similar role.
- Experience with schema design and dimensional data modeling.
- Experience and knowledge in Python development.
- Advanced working knowledge of SQL, including query authoring, and experience working with a variety of relational databases.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Experience building and optimizing data pipelines, architectures and data sets.
- Experience designing, building, and maintaining data processing systems.
- Experience with orchestration tools for batch and real-time data processing.
- Experience with CI/CD pipelines for data.
- Experience with big data.
- Familiarity with data integration and ETL tools.
- Strong problem-solving and analytical skills.
- Able to speak Thai fluently, with a basic command of English.
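
The validation and cleansing duties above lend themselves to a short illustration. The sketch below is a minimal pandas example, not part of the posting: the table, column names (customer_id, email, signup_date), and rules are all hypothetical.

```python
import pandas as pd

def clean_customers(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple validation and cleansing rules to a raw customer extract."""
    out = df.copy()
    # Drop exact duplicates and rows missing the primary key.
    out = out.drop_duplicates().dropna(subset=["customer_id"])
    # Normalize and validate email addresses; invalid values become NA.
    out["email"] = out["email"].str.strip().str.lower()
    bad_email = ~out["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
    out.loc[bad_email, "email"] = pd.NA
    # Coerce signup_date to a proper datetime; unparseable values become NaT.
    out["signup_date"] = pd.to_datetime(out["signup_date"], errors="coerce")
    return out

if __name__ == "__main__":
    raw = pd.DataFrame({
        "customer_id": [1, 1, 2, None],
        "email": [" A@EXAMPLE.COM ", " A@EXAMPLE.COM ", "not-an-email", "b@example.com"],
        "signup_date": ["2024-01-05", "2024-01-05", "2024-02-30", "2024-03-01"],
    })
    print(clean_customers(raw))
```

In a real pipeline this step would sit between ingestion and loading, with rejected rows logged for review rather than silently dropped.
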
Experience:
5+ years
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
Negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark.
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow (a minimal PySpark ingestion sketch follows this list).
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
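
As a hedged illustration of the pipeline and data-lake work listed above, here is a minimal PySpark batch-ingestion sketch; the bucket paths, column names, and partition key are assumptions made for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest a raw CSV extract from object storage (paths are placeholders).
raw = (
    spark.read.option("header", True).option("inferSchema", True)
    .csv("s3a://example-landing/orders/*.csv")
)

# Light transformation: normalize column names, derive a partition column, dedupe.
orders = (
    raw.withColumnRenamed("OrderDate", "order_date")
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
       .dropDuplicates(["order_id"])
)

# Write to the curated zone of the data lake, partitioned by month.
(orders.write.mode("overwrite")
       .partitionBy("order_month")
       .parquet("s3a://example-curated/orders/"))
```

An equivalent job could target a warehouse table or a Delta/lakehouse format instead of plain Parquet, depending on the platform in use.
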
Skills:
Cloud Computing, SAP, Linux
Job type:
Full-time
Salary:
Negotiable
- Act as the key Cloud technical resource for the consulting team, providing technical consulting to both internal and external customers.
- Design Cloud solution architecture in response to the client's requirements.
- Provide advisory consulting services to the client regarding True IDC Consulting practices.
- Create Cloud technical requirements for the client's migration plan.
- Experience of designing and implementing comprehensive Cloud computing solutions on various Cloud technologies e.g. AWS, GCP.
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Experience with SAP Cloud Infrastructure architecture and design on AWS and GCP public clouds.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and their architecture on the Cloud.
- Knowledge of containerization administration for both Windows and Linux technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Good at handling customer objections and strong customer presentation skills.
- Nice to have:
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate.
- GCP Certified Solution Architect - Associate.
Skills:
DevOps, Automation, Kubernetes
Job type:
Full-time
Salary:
Negotiable
- Manage 7-8 Professional Services Engineers responsible for AWS cloud solution architecting and implementation/migration according to project requirements.
- Manage team resources.
- Act as the key Cloud technical resource for the consulting team, providing AWS cloud technical consulting to customers.
- Design AWS Cloud solution architecture in response to the client's requirements.
- Define the scope of work and estimate man-days for cloud implementation.
- Manage cloud project delivery to meet customer requirements and timelines.
- Support AWS and GCP cloud partner competency building (e.g., AWS Certification) and the professional services delivery process and documentation.
- Act as the AWS technical speaker for True IDC webinars and online events such as CloudTalk.
- Drive team competency expansion to meet the yearly competency roadmap strategy, e.g., DevOps, IaC, Automation, Kubernetes, and application modernization on AWS cloud.
- Experience leading an AWS cloud implementation and delivery team.
- Experience designing and implementing comprehensive Cloud computing solutions on AWS; GCP is a plus.
- Experience with infrastructure as code, either cloud-native (CloudFormation) or tools such as Terraform and Ansible (a small boto3/CloudFormation sketch follows this list).
- Experience in building multi-tier Service Oriented Architecture (SOA) applications.
- Knowledge of Linux, Windows, Apache, IIS, and NoSQL operations and their architecture on the Cloud.
- Knowledge of OS administration for both Windows and UNIX technologies.
- Knowledge of key concerns and how they are addressed in Cloud Computing such as security, performance and scalability.
- Knowledge of Kubernetes, Containers and CI/CD, DevOps.
- Experience with RDBMS designing and implementing over the Cloud.
- Prior experience with application development using various development stacks such as Java, .NET, and Python.
- Experience in .NET and/or the Spring Framework and RESTful web services.
- UNIX shell scripting.
- AWS Certified Solutions Architect - Associate; Professional level preferred.
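
The infrastructure-as-code item above can be illustrated with a small sketch that drives CloudFormation from Python via boto3; the stack name and the one-bucket template are placeholders, and real projects would keep templates in version control and deploy them through a CI/CD pipeline.

```python
import boto3

# A deliberately tiny template: one S3 bucket. Names are placeholders.
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket
"""

def deploy_stack(stack_name: str = "example-iac-demo") -> str:
    """Create a CloudFormation stack and wait until it is ready."""
    cfn = boto3.client("cloudformation")
    cfn.create_stack(StackName=stack_name, TemplateBody=TEMPLATE)
    # Block until the stack reaches CREATE_COMPLETE (or the waiter times out).
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
    return cfn.describe_stacks(StackName=stack_name)["Stacks"][0]["StackStatus"]

if __name__ == "__main__":
    print(deploy_stack())
```
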
Experience:
5+ years
Skills:
Data Analysis, Automation, Python
Job type:
Full-time
Salary:
Negotiable
- Work with stakeholders throughout the organization to understand data needs, identify issues or opportunities for leveraging company data, and propose solutions that support decision making and drive business outcomes.
- Adopt new technology, techniques, and methods, such as machine learning or statistical techniques, to produce new solutions to problems.
- Conduct advanced data analysis and create appropriate algorithms to solve analytics problems.
- Improve the scalability, stability, accuracy, speed, and efficiency of existing data models.
- Collaborate with internal teams and partners to scale development up to production.
- Maintain and fine-tune existing analytic models to ensure model accuracy.
- Support the enhancement and accuracy of predictive automation capabilities based on valuable internal and external data and on established objectives for Machine Learning competencies.
- Apply algorithms to generate accurate predictions and resolve dataset issues as they arise.
- Act as project manager for data projects, managing project scope, timeline, and budget.
- Manage relationships with stakeholders and coordinate work between different parties, as well as providing regular updates.
- Control, manage, and govern Level 2 support; identify and fix configuration-related problems.
- Keep data modelling and training models maintained and up to date.
- Run through the data flow diagram for model development.
- EDUCATION.
- Bachelor's degree or higher in computer science, statistics, or operations research or related technical discipline.
- EXPERIENCE.
- At least 5 years of experience in a statistical and/or data science role.
- Expertise in advanced analytical techniques such as descriptive statistical modelling and algorithms, machine learning algorithms, optimization, data visualization, pattern recognition, cluster analysis and segmentation analysis.
- Experience using analytical tools and languages such as Python, R, SAS, Java, C, C++, C#, Matlab, IBM SPSS, Tableau, QlikView, RapidMiner, Apache Pig, Spotfire, Micro S, SAP HANA, Oracle, or SQL-like languages.
- Experience working with large data sets, simulation/optimization and distributed computing tools (e.g., Map/Reduce, Hadoop, Hive, Spark).
- Experience developing and deploying machine learning models in a production environment (a minimal training-and-evaluation sketch follows this list).
- Knowledge of oil and gas business processes is preferable.
- OTHER REQUIREMENTS.
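
To make the model-development and deployment requirements above concrete, here is a minimal scikit-learn training-and-evaluation sketch on synthetic data; the dataset, the random-forest choice, and the AUC metric are illustrative assumptions rather than anything specified in the posting.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for business data; real features/labels would come from the warehouse.
X, y = make_classification(n_samples=5_000, n_features=20, weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# AUC copes better than accuracy with the imbalanced labels generated above.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```
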
Experience:
6+ years
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
Negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practices, processes, and procedures, as well as identify future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks, assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success.
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architectures and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation languages and tools such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka, as needed to build and maintain complex queries and streaming, real-time data pipelines (a small structured-streaming sketch follows this skills list).
- Data modelling and architecting skills including strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP, or data vault.
- Good fundamentals around security integration including Kerberos authentication, SAML and data security and privacy such as data masking and tokenisation techniques.
- Good knowledge in DevOps engineering using Continuous Integration/ Delivery tools.
- An in-depth understanding of Cloud solutions (AWS, Azure and/or GCP) and experience integrating them into traditional hosting/delivery models.
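
As a hedged sketch of the streaming-pipeline skills above, the example below consumes a Kafka topic with Spark Structured Streaming and aggregates events in five-minute windows; the broker address, topic, schema, and checkpoint path are placeholders, and the job assumes the spark-sql-kafka connector package is available.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Hypothetical event schema for the Kafka message value.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Count events per type in 5-minute windows and stream results to the console.
counts = events.groupBy(F.window("event_time", "5 minutes"), "event_type").count()

query = (counts.writeStream.outputMode("update")
         .format("console")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .start())
query.awaitTermination()
```

A production job would typically add a watermark, write to a durable sink, and secure the Kafka connection (e.g., Kerberos/SASL), in line with the security items above.
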
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Skills:
Statistics, Data Analysis, SQL
Job type:
Full-time
Salary:
Negotiable
- Data Science Foundations: Strong foundation in data science, statistics, and advanced data analytics, including data visualization to communicate insights effectively.
- Exploratory Data Analysis (EDA): Skilled in performing EDA to uncover patterns, detect anomalies, and generate meaningful insights from data.
- Experimentation & Testing: Skilled in designing A/B tests or other experimental designs to measure business impact, analyze results, and communicate findings clearly to stakeholders (a minimal A/B analysis sketch follows this list).
- Machine Learning & AI.
- Model Development & Deployment: Experience in building, deploying, and optimizing machine learning models on large datasets.
- Generative AI (GenAI): Opportunity to work on GenAI projects that drive innovation and impactful business solutions.
- Problem-Solving & Collaboration.
- Analytical & Problem-Solving Skills: Strong analytical and problem-solving abilities focused on deriving actionable insights from data.
- Team Collaboration: Ability to work effectively both independently and as part of a collaborative team, contributing to shared project goals.
- Technical Expertise.
- Proficiency in Big Data Technologies: Expertise in Spark, PySpark, and SQL for large-scale data processing focused on feature creation for machine learning models and data analysis tasks.
- Programming Skills: Strong proficiency in Python for data analysis and machine learning (including libraries like Pandas, PySpark, Scikit-learn, XGBoost, LightGBM, Matplotlib, Plotly, Seaborn, etc.).
- Python Notebooks: Familiarity with Jupyter, Google Colab, or Apache Zeppelin for interactive data analysis and model development.
- Platform Experience: Experience in using PySpark on cloud platforms such as Azure Databricks or other platforms (including on-premise) is a plus.
- Education & Experience.
- Educational Background: Bachelor's or an advanced degree in Data Science, Statistics, Computer Science, Computer Engineering, Mathematics, Information Technology, Engineering, or related fields.
- Work Experience: At least 2-3 years of relevant experience in Data Science, Analytics, or Machine Learning, with demonstrated technical expertise and a proven track record of driving data-driven business solutions.
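
The experimentation item above can be grounded with a minimal analysis sketch: a two-proportion z-test comparing conversion rates between a control and a treatment group. The counts below are invented for illustration.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment counts: conversions and visitors per variant.
conversions = [430, 489]      # [control, treatment]
visitors = [10_000, 10_050]

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
control_rate = conversions[0] / visitors[0]
treatment_rate = conversions[1] / visitors[1]

print(f"control rate:   {control_rate:.3%}")
print(f"treatment rate: {treatment_rate:.3%}")
print(f"z = {stat:.2f}, p = {p_value:.4f}")
# Interpret p against the pre-registered significance level (e.g., 0.05)
# and report the absolute lift alongside it.
```
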
Experience:
2+ years
Skills:
Oracle, Linux, Web Services, Database Administration, English
Job type:
Full-time
Salary:
฿30,000 - ฿60,000, negotiable
- Maintain the company's software, hardware, network, and other equipment, and support the team so that equipment can be used without interruption.
- Handle Backup, Recovery, and the Disaster Recovery Site according to the company's plan.
- Analyze and resolve technical problems across software, hardware, and network.
- Provide technical support for software, hardware, and network so the team can carry out project work according to the defined plan.
- Provide service and assistance to customers regarding the company's systems, Oracle databases, and other technical problems, so they can run continuously at customer sites and on the Cloud.
- Install Oracle Products, PHP, and the company's systems at customer sites or on the Cloud.
- Study, research, and adopt new technologies in software, hardware, networking, and Oracle Products for use in the company's systems.
- Bachelor's degree in IT, in Computer Science, Computer Engineering, or other related fields.
- Experience installing and troubleshooting Oracle Database, with strong SQL skills; the ability to write PL/SQL is a plus (a small python-oracledb query sketch follows this list).
- Experience installing and troubleshooting Linux (e.g., Ubuntu, CentOS, Redhat) and Microsoft Windows, both Server and Workstation.
- Experience installing and configuring Web Servers (Apache, nginx) and PHP; experience with Docker is a plus.
- Proficient in writing Shell Scripts on Linux.
- Experience and knowledge in analyzing and resolving technical problems in software, hardware, and networking.
- Experience and knowledge of Network, VMware, Active Directory, Web Server, DNS, DHCP, Anti-virus, and other packaged software.
- Experience with AWS is a plus.
- Able to learn and research new technologies.
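
To illustrate the Oracle/SQL requirement above, here is a small query sketch using the python-oracledb driver (Python is used to keep the examples in one language); the credentials, DSN, and the app_orders table are placeholders.

```python
import oracledb  # the python-oracledb driver (thin mode needs no Oracle client install)

# Connection details are placeholders; use real credentials/DSN in practice.
with oracledb.connect(user="app_user", password="app_password",
                      dsn="dbhost:1521/ORCLPDB1") as conn:
    cur = conn.cursor()
    # A simple health-check style query against a hypothetical table.
    cur.execute(
        "SELECT status, COUNT(*) FROM app_orders GROUP BY status ORDER BY status"
    )
    for status, total in cur:
        print(f"{status}: {total}")
```
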
Experience:
5+ years
Skills:
Java, Spring Boot, Thai, English
Job type:
Full-time
Salary:
Negotiable
- Responsible for the detailed solution design of the required solution, aligning with the Technology Solution Lead, Senior BA/BA, and all partner teams, and managing the delivery of detailed technical specifications by the respective team resources, ensuring the quality of deliverables meets the expected requirements.
- Coordinate with BU/SU to gather requirement and design application architecture aligned with IT Blueprint as well as business needs and directions.
- Manage resources to provide the related services, including adaptation of applications ...
- Application Solution Delivery.
- Leading the deliverable of application solution with well design architecture and aligned with standard in IT Blueprint.
- Manage resources to apply the proper technology in developing value-added solutions to serve business needs.
- Manage resources to deliver automated and fully integrated solutions for end-to-end work processes.
- Bachelor's degree or higher in Information Technology, Computer Science, or other related fields.
- At least 5 years of experience in core Java fundamentals, Java 8+, Spring, Spring Boot, and testing frameworks like JMeter.
- Experience in Application Architecture and System Integration using technologies such as Unix, Linux, Apache, JBoss, SQL databases, MQ, Redis.
- Hands-on experience with Next.js, React, Java Spring Boot, Bootstrap, and Tailwind.
- Familiarity with message queues (MQ) and Redis, along with experience using automation tools, Git control, and support tools (a small Redis queue sketch follows this list).
- CI/CD implementation experience from scratch, using tools like GitHub, GitLab, Bitbucket, and Jenkins.
- Experience with JBoss, OpenShift, Docker, and Firebase messaging services.
- Experience in network and security, including resolving firewall connection issues, addressing integration challenges, load balancing, and disaster recovery planning.
- Experience developing Unix Shell Scripting, SQL, Java, and Python from scratch.
- Experience in application and database design.
- Experience in Production Support Management, including Incident and Problem Management.
- Knowledge of banking products or the banking and financial industry would be advantageous.
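
The message-queue and Redis item above can be sketched with a tiny producer/consumer example using a Redis list as a simple work queue; it is written in Python with the redis-py client to keep the examples in one language, and the queue name and payload are hypothetical.

```python
import json
import redis  # redis-py client

r = redis.Redis(host="localhost", port=6379, db=0)

# Producer: push a message onto a list used as a simple work queue.
r.lpush("orders:queue", json.dumps({"order_id": 1001, "action": "provision"}))

# Consumer: block (up to 30 s) until a message is available, then process it.
item = r.brpop("orders:queue", timeout=30)
if item is not None:
    _key, payload = item
    message = json.loads(payload)
    print(f"processing order {message['order_id']}: {message['action']}")
```

In a Java/Spring stack the same pattern would typically use Spring Data Redis or a dedicated broker (e.g., the MQ product mentioned above); this sketch only shows the queueing idea.
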
Skills:
Compliance, Research, Automation
Job type:
Full-time
Salary:
Negotiable
- DataOps, MLOps, and AIOps: Design, build, and optimize scalable, secure, and efficient data pipelines for AI/ML workflows.
- Automate data ingestion, transformation, and deployment across AWS, GCP, and Azure.
- Implement MLOps and AIOps for model versioning, monitoring, and automated retraining.
- Ensure performance, security, scalability, and cost efficiency in AI lifecycle management.
- Performance Optimization & Security: Monitor, troubleshoot, and optimize AI/ML pipelines and data workflows to enhance reliability.
- Implement data governance policies, security best practices, and compliance standards.
- Collaborate with cybersecurity teams to address vulnerabilities and ensure data protection.
- Data Engineering & System Integration: Develop and manage real-time and batch data pipelines to support AI-driven applications.
- Enable seamless integration of AI/ML solutions with enterprise systems, APIs, and external platforms.
- Ensure data consistency, quality, and lineage tracking across the AI/ML ecosystem.
- AI/ML Model Deployment & Optimization: Deploy and manage AI/ML models in production, ensuring accuracy, scalability, and efficiency.
- Automate model retraining, performance monitoring, and drift detection for continuous improvement (a minimal drift-check sketch follows this list).
- Optimize AI workloads for resource efficiency and cost-effectiveness on cloud platforms.
- Continuous Learning & Innovation: Stay updated on AI/ML advancements, cloud technologies, and big data innovations.
- Contribute to proof-of-concept projects, AI process improvements, and best practices.
- Participate in internal research, knowledge-sharing, and AI governance discussions.
- Cross-Functional Collaboration & Business Understanding: Work with business teams to ensure AI models align with organizational objectives.
- Gain a basic understanding of how AI/ML supports predictive analytics, demand forecasting, automation, personalization, and content generation.
- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field. Advanced degrees or relevant certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer, Azure Data Engineer) are a plus.
- Experience: Minimum of 3-5 years of experience in a data engineering or operations role, with a focus on DataOps, MLOps, or AIOps.
- Proven experience managing cloud platforms (AWS, GCP, and/or Azure) in a production environment.
- Hands-on experience with designing, operating, and optimizing data pipelines and AI/ML workflows.
- Technical Skills: Proficiency in scripting languages such as Python and Bash, along with experience using automation tools.
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is desirable.
- Strong knowledge of data processing frameworks (e.g., Apache Spark) and data pipeline automation tools.
- Expertise in data warehouse solutions and emerging data lakehouse architectures.
- Experience with AWS technologies is a plus, especially AWS Redshift and AWS SageMaker, as well as similar tools on other cloud platforms.
- Understanding of machine learning model deployment and monitoring tools.
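
As a minimal illustration of the drift-detection duty above, the sketch below compares a training-time reference sample of one feature with recent production values using a two-sample Kolmogorov-Smirnov test; the distributions and the alert threshold are invented for the example.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)

# Hypothetical feature values: training-time reference vs. recent production traffic.
reference = rng.normal(loc=0.0, scale=1.0, size=5_000)
production = rng.normal(loc=0.3, scale=1.1, size=5_000)  # slightly shifted on purpose

stat, p_value = ks_2samp(reference, production)
DRIFT_ALPHA = 0.01  # alert threshold; tune per feature and traffic volume

print(f"KS statistic = {stat:.3f}, p = {p_value:.2e}")
if p_value < DRIFT_ALPHA:
    print("drift detected: flag the feature and consider triggering retraining")
```

In practice a check like this would run per feature on a schedule (or inside the monitoring stack) and feed the automated retraining workflow described above.
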
Skills:
Compliance
Job type:
Full-time
Salary:
Negotiable
- Ensure data availability, data integrity, and quality.
- Conduct regular system audits and generate reports on system performance and usage.
- Data Analysis and Reporting.
- Collect, analyze, and interpret data to provide actionable insights for business strategy.
- Develop and maintain dashboards, reports, and visualizations for various stakeholders.
- Support data-driven decision-making processes across the organization.
- Technical Support and Troubleshooting.
- Provide technical support for databases, visualization tools, and other systems, collaborating with related teams to resolve issues promptly.
- Project Management.
- Work with cross-functional teams to gather requirements and develop project plans.
- Monitor project progress and adjust plans as necessary to meet objectives.
- System Development and Integration.
- Identify opportunities for system improvements and innovations.
- Design and implement system enhancements and integrations with other business applications.
- Ensure compliance with industry standards and regulatory requirements.
- Bachelor's or Master's Degree in MIS, IT, computer science, statistics, mathematics, business, or related field.
- Minimum of 5 years' experience in BI, dashboard, and data analysis roles.
- Experienced in the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation.
- Data visualization (Microsoft Power BI, Tableau); Apache Superset is a plus.
- Strong creative and analytical problem-solving capabilities.
- Communication skills.
- Knowledge of database concepts and management.
- Excellent with MS Excel and SQL; Python, Airflow, and PySpark are a plus (a minimal Airflow DAG sketch follows this list).
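
To ground the Airflow mention above, here is a minimal Airflow 2.x-style DAG sketch with two placeholder tasks; the DAG id, schedule, and task bodies are assumptions for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull source data (placeholder)")

def build_report() -> None:
    print("aggregate metrics and refresh the dashboard dataset (placeholder)")

# A tiny daily pipeline: extract, then rebuild the reporting dataset.
with DAG(
    dag_id="daily_reporting",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # on older Airflow 2.x use schedule_interval="@daily"
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_report = PythonOperator(task_id="build_report", python_callable=build_report)
    t_extract >> t_report
```
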