Experience:
2 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- You will be involved in all aspects of the project life cycle, including strategy, road-mapping, architecture, implementation and development.
- You will work with business and technical stakeholders to gather and analyse business requirements and convert them into technical requirements, specifications, and mapping documents.
- You will collaborate with technical teams, ensuring that newly implemented solutions and technologies meet business requirements.
- Outputs include workshop sessions and documentation such as mapping documents.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Skills and attributes for success.
- 2-4 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to produce client-ready solutions and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation technologies such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka (a minimal sketch follows this list).
- Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP or data vault.
- Good knowledge of DevOps engineering using Continuous Integration/Delivery tools.
- An in-depth understanding of cloud solutions (AWS, Azure, and/or GCP) and experience integrating them into traditional hosting/delivery models.
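For illustration only, here is a minimal sketch of the kind of streaming pipeline this bullet refers to, assuming PySpark with the Spark-Kafka connector; the broker address, topic, schema, and output paths are hypothetical placeholders, not details from the posting.

    # Minimal sketch: consume a Kafka topic with Spark Structured Streaming,
    # parse JSON payloads, and land them as Parquet. All names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StringType, DoubleType

    spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

    schema = StructType().add("event_id", StringType()).add("amount", DoubleType())

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
           .option("subscribe", "events")                     # placeholder topic
           .load())

    events = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(from_json(col("json"), schema).alias("e"))
              .select("e.*"))

    (events.writeStream
     .format("parquet")
     .option("path", "/tmp/events")              # placeholder sink
     .option("checkpointLocation", "/tmp/ckpt")  # placeholder checkpoint
     .start())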
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Experience:
3 years required
Skills:
Kubernetes, Automation, Redis
Job type:
Full-time
Salary:
negotiable
- Platform Operations: Manage and operate our Kubernetes platform, ensuring high availability, performance, and security.
- Automation & Tooling: Design, develop, and implement automation solutions for operational tasks, infrastructure provisioning, and application deployment.
- Observability: Build and maintain a comprehensive observability stack (monitoring, logging,tracing) to proactively identify and resolve issues.
- Platform Stability & Performance: Implement and maintain proactive measures to ensure platform stability, performance optimization, and capacity planning.
- Middleware Expertise: Provide support and expertise for critical middleware tools such as RabbitMQ, Redis, and Kafka, ensuring their optimal performance and reliability.
- Incident Response: Participate in our on-call rotation, troubleshoot and resolve production incidents efficiently, and implement preventative measures.
- Collaboration: Collaborate effectively with development and other engineering teams.
- Positive attitude and empathy for others.
- Passion for developing and maintaining reliable, scalable infrastructure.
- A minimum of 3 years working experience in relevant areas.
- Experience in managing and operating Kubernetes in a production environment.
- Experienced with cloud platforms like AWS or GCP.
- Experienced with high availability, high-scale, and performance systems.
- Understanding of cloud-native architectures.
- Experienced with DevSecOps practices.
- Strong scripting and automation skills using languages like Python, Bash, or Go.
- Proven experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI).
- Deep understanding of monitoring, logging, and tracing tools and techniques.
- Experience with infrastructure-as-code tools (e.g., Terraform, Ansible).
- Strong understanding of Linux systems administration and networking concepts.
- Experience working with middleware technologies like RabbitMQ, Redis, and Kafka.
- Excellent problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.
- Strong interest and ability to learn any new technical topic.
- Experience with container security best practices.
- Experience with chaos engineering principles and practices.
- Experience in the Financial Services industry.
- Opportunity to tackle challenging projects in a dynamic environment.
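As a purely illustrative sketch of the automation work described above, the snippet below uses the official Kubernetes Python client to flag pods that are not healthy; the kubeconfig assumption and the phase check are placeholders, not the employer's actual tooling.

    # Minimal sketch: list pods across all namespaces and report any that are
    # not Running or Succeeded. Assumes a reachable kubeconfig (or in-cluster config).
    from kubernetes import client, config

    def unhealthy_pods():
        config.load_kube_config()  # or config.load_incluster_config() inside a pod
        v1 = client.CoreV1Api()
        problems = []
        for pod in v1.list_pod_for_all_namespaces(watch=False).items:
            if pod.status.phase not in ("Running", "Succeeded"):
                problems.append((pod.metadata.namespace, pod.metadata.name, pod.status.phase))
        return problems

    if __name__ == "__main__":
        for namespace, name, phase in unhealthy_pods():
            print(f"{namespace}/{name}: {phase}")

In practice a script like this would feed the observability stack or an alerting channel rather than printing to stdout.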
Experience:
7 years required
Skills:
Java, Spring Boot, SQL
Job type:
Full-time
Salary:
negotiable
- Design, develop, test, deploy, document, and maintain software applications using Core Java, Spring Boot, Hibernate, a microservice framework, MySQL, MongoDB, Docker, Kubernetes, CI/CD pipelines, and cloud technologies.
- Provide technical support to end-users and ensure system availability and security at all times.
- Collaborate with cross-functional teams to identify and prioritize software requirements, ensuring alignment with business objectives.
- Troubleshoot and resolve software defects, working closely with developers and other stakeholders to implement solutions.
- Stay updated with emerging technologies and industry trends, integrating innovative approaches for sustained competitive advantage.
- Assist in the development of project plans, schedules, and budgets, tracking progress and ensuring timely delivery of projects.
- Ensure adherence to coding standards, best practices, and quality assurance processes throughout the software development lifecycle.
- Conduct code reviews and provide constructive feedback to improve team members' coding skills and knowledge sharing within the team.
- Qualifications: Bachelor's degree in Computer Science or a related field (or equivalent experience).
- Strong proficiency in Core Java, Spring Boot, Hibernate, a microservice framework, MySQL, MongoDB, Docker, Kubernetes, CI/CD pipelines, and cloud technologies.
- Experience with front-end frameworks such as AngularJS or ReactJS.
- Solid understanding of web application security principles and best practices.
- Knowledge of relational database management systems (RDBMS) and SQL.
- Familiarity with version control systems such as Git.
- Excellent problem-solving and analytical skills, with attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.
- Effective communication and interpersonal skills.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
Experience:
6 years required
Skills:
Big Data, Good Communication Skills, Scala
Job type:
Full-time
Salary:
negotiable
- Collate technical and functional requirements through workshops with senior stakeholders in risk, actuarial, pricing and product teams.
- Translate business requirements to technical solutions leveraging strong business acumen.
- Analyse current business practice, processes, and procedures as well as identifying future business opportunities for leveraging Data & Analytics solutions on various platforms.
- Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
- Provide architectural expertise to sales, project and other analytics teams.
- Identify risks and assumptions, and develop pricing estimates for the Data & Analytics solutions.
- Provide solution oversight to delivery architects and teams.
- Skills and attributes for success.
- 6-8 years of experience in Big Data, data warehouse, data analytics projects, and/or any Information Management related projects.
- Prior experience building large scale enterprise data architectures using commercial and/or open source Data Analytics technologies.
- Ability to estimate complexity, effort and cost.
- Ability to produce client-ready solution architecture and business-understandable presentations, with good communication skills to lead and run workshops.
- Strong knowledge of data manipulation technologies such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka, necessary to build and maintain complex queries, streaming, and real-time data pipelines.
- Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP or data vault.
- Good fundamentals in security integration, including Kerberos authentication and SAML, and in data security and privacy, such as data masking and tokenisation techniques (a minimal sketch follows this list).
- Good knowledge of DevOps engineering using Continuous Integration/Delivery tools.
- An in-depth understanding of cloud solutions (AWS, Azure, and/or GCP) and experience integrating them into traditional hosting/delivery models.
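For illustration of the data masking and tokenisation techniques mentioned above, here is a minimal sketch assuming HMAC-based deterministic tokenisation in Python; the key handling and field values are hypothetical and not a description of EY's approach.

    # Minimal sketch: deterministic tokenisation (stable, non-reversible without
    # the key) plus simple masking for analyst-facing extracts.
    import hmac
    import hashlib

    SECRET_KEY = b"replace-with-a-vaulted-key"  # placeholder; keep real keys in a vault

    def tokenise(value: str) -> str:
        # Same input always yields the same token, so joins across datasets still work.
        return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

    def mask(value: str, visible: int = 4) -> str:
        # Show only the trailing characters for display purposes.
        return "*" * max(len(value) - visible, 0) + value[-visible:]

    print(tokenise("1234-5678-9012-3456"))
    print(mask("1234-5678-9012-3456"))  # ***************3456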
- Ideally, you'll also have.
- Experience in engaging with both technical and non-technical stakeholders.
- Strong consulting experience and background, including engaging directly with clients.
- Demonstrable Cloud experience with Azure, AWS or GCP.
- Configuration and management of databases.
- Experience with big data tools such as Hadoop, Spark, Kafka.
- Experience with AWS and MS cloud services.
- Python, SQL, Java, C++, Scala.
- Highly motivated individuals with excellent problem-solving skills and the ability to prioritize shifting workloads in a rapidly changing industry. An effective communicator, you'll be a confident leader equipped with strong people management skills and a genuine passion to make things happen in a dynamic organization.
- What working at EY offers.
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
- About EY
- As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. So that whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.
- If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
- Join us in building a better working world. Apply now!
Experience:
8 years required
Skills:
Full Stack, GraphQL, Scrum, English
Job type:
Full-time
Salary:
negotiable
- Maintain ownership and responsibility of mission critical systems.
- Be hands on - build high volume platforms using cutting-edge technologies like React and GraphQL.
- Mentor and coach other software engineers.
- Be a major contributor to our agile and scrum practices.
- Design and lead crucial technical projects and initiatives cross teams and departments.
- Stay on the leading edge of technical know-how, industry trends and drive technical innovations.
- 8 years of experience developing web applications in client-side frameworks such as React, Angular, VueJS, etc.
- B.S. in Computer Science or quantitative field; M.S. preferred.
- Working experience with agile, analytics, A/B testing and/or feature flags, Continuous Delivery, Trunk-based Development.
- Excellent HTML/CSS skills - you understand not only how to build the data, but how to make it look great too.
- Excellent understanding of object-oriented JavaScript, TypeScript.
- You love new technologies and approaches and want to use the best tools available. We want people who can help us continually evolve our stack.
- Great communication and coordination skills.
- Excellent analytical thinking and problem-solving skills.
- You have a good command of the English language.
- Knowledge of physical architecture at scale: building resilient, highly available solutions with no single point of failure.
- Knowledge in one or more of the following: NoSQL technologies (Cassandra, ScyllaDB, ElasticSearch, Redis, DynamoDB, etc), Queueing system experience (Kafka, RabbitMQ, SQS, Azure Service Bus, etc).
- Working experience with containers and Dockerization; Kubernetes (K8s) is a plus.
- Knowledge and hands-on experience with CI/CD solutions would be a plus.
- Strong experience in all aspects of client-side performance optimization.
- Extremely proficient in modern coding and design practices, for example Clean Code, SOLID principles, and TDD.
- Experience in multiple front-end platforms including iOS, Android, Web, and API services.
- Have worked at an app or internet company operating at scale, with large numbers of users and transactions per second.
- Have experience at a data-driven company, analyzing and working with Big Data.
- Led teams and greenfield projects solving large system problems.
- Worked on global projects serving world markets with distributed data centers and localization of the front end and data.
- This position is based in Bangkok, Thailand (Relocation Provided).
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- Disclaimer.
- We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
Skills:
Python, ASP.NET, .NET
Job type:
Full-time
Salary:
negotiable
- Oversee Data Pipelines that are in production and supervise the building of new Data Pipelines, including meeting the SLAs planned together with the relevant teams.
- Study new technologies to improve the Data Pipelines so they better serve users and business needs.
- Study and understand the systems whose data must be standardised and consolidated into a single source of truth, as well as the data needs of executives in each business unit; take part in project planning, propose budgets and implementation approaches, and present ...
- Design and develop Data Pipelines.
- Design Data Pipelines under the supervision of the Database Design and Development Manager.
- Oversee the Data Architecture and develop Data Pipelines.
- Oversee data ingestion, covering both data streaming and data batch, and deliver data to the various BUs.
- Oversee testing of the resulting Data Pipelines and plan software go-lives together with users.
- Oversee the preparation of documentation and user manuals for the developed software and related systems, as references for usage and for further development of the software.
- Coordinate with the relevant units to resolve issues or enhance systems.
- Oversee the Data Architecture, including the Data Pipeline development work of the design and development officers.
- Experience in Data Modeling and the ability to design Data Architecture, develop data structures that match business unit needs, produce Logical/Physical Data Models, and understand data management principles will be a strong advantage.
- Bachelor's degree in Computer Science or a related field.
- At least 3 years of experience as a Data Engineer/Data Architect.
- Understanding of Data Engineering work; able to build Data Pipelines in both batch and stream form (a minimal sketch follows this list).
- Able to write and work with Python, ASP.NET C#, MSSQL, and MySQL.
- Able to write and work with Golang, Kafka, Debezium CDC, Apache Airflow, Apache Spark, and MongoDB (if applicable).
- Database skills.
- SQL skills.
- Pivot Table skills.
- Database performance tuning skills.
- Critical thinking and problem-solving skills.
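As a minimal illustration of a batch Data Pipeline, the sketch below assumes Apache Airflow 2.x (listed above as a nice-to-have); the DAG id, schedule, and task bodies are placeholders rather than the company's actual pipeline.

    # Minimal sketch: a daily batch pipeline skeleton with extract, transform,
    # and load steps wired together in an Airflow DAG.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        pass  # e.g. pull data from a source system

    def transform():
        pass  # e.g. standardise records into the agreed schema

    def load():
        pass  # e.g. write to the single source of truth

    with DAG(
        dag_id="daily_batch_pipeline",  # placeholder name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_transform >> t_load

A streaming counterpart would typically consume from Kafka (or via Debezium CDC) instead of running on a schedule.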
- Contact.
- Human Resources Office.
- Thai Beverage Public Company Limited.
- Lao Peng Nguan Tower 1, 333 Vibhavadi Rangsit Road, Chomphon, Chatuchak, Bangkok 10900.
Skills:
Java, Spring Boot, Kubernetes
Job type:
Full-time
Salary:
negotiable
- Work in an agile team to build/develop features and technologies across various aspects of the Java stack, primarily focused on Spring Boot and Spring Cloud/NetflixOSS.
- CI/CD deployments on a Kubernetes-based platform, both on-premises and on multi-cloud infrastructure. (AWS and GCP).
- Possess an understanding of cloud-native architectures and be familiar with implementations involving service discovery, circuit breakers, client-side load balancing, and other architectural patterns related to elastic infrastructure.
- Participate in, and help create a company culture that attracts, retains, and coaches other engineers. The primary deliverable of a senior engineer is more senior engineers.
- Conduct design and code reviews.
- Provide specific technical expertise to help drive innovation.
- Identify emerging technologies to create leading-edge banking products.
- Partner with architects and platform engineers to build strategies for execution, drive and facilitate key decisions, influence others, and lead change where appropriate.
- A positive, can-do attitude and a natural, high degree of empathy for others.
- Bachelor's Degree in Computer Science or equivalent work experience.
- Relevant work experience, or 3+ years for a senior position.
- Experience in building complex applications from scratch and decomposing monolithic applications into micro-services.
- At a minimum: core Java 8, Spring Boot, and Spring Cloud.
- Kubernetes (or Docker/ Mesos and equivalent).
- MySQL, PostgreSQL, EnterpriseDB, NoSQL (Cassandra, MongoDB).
- RabbitMQ, Kafka.
- AWS & GCP.
- API Gateway.
- Linux.
- CI/CD (Jenkins, Git).
- React.JS (Optional).
- Experience with distributed architectures, SOA, microservices, and Platform-as-a-service (PaaS).
- Experience with Agile and Test-Driven Development (TDD) methodologies.
- Experience with high availability, high-scale, and performance systems.
- Experience in Automation testing/ or Unit testing is a plus.
- Location: True Digital Park, Bangkok.
Skills:
node.js, Java, Spring Boot
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust backend services using Node.js, Nest.js, Java, Spring Boot, Camel, and cloud platforms like AWS and GCP.
- Design and build scalable, event-driven, and failure-tolerant systems. Advocate for and implement best practices in DevSecOps, test-driven development (TDD), and continuous delivery pipelines.
- Collaborate on diverse projects in domains such as Payment, Cart, Fulfillment, Search, and Recommendation.
- Vector Search: working with vector similarity search to enhance relevance (a minimal sketch follows this list).
- ML Models (XGBoost, CNNs): Applying machine learning models for search relevance and personalization.
- LLMs & PEFT: Fine-tuning large language models using Parameter-Efficient Fine-Tuning (PEFT).
- (These skills are not mandatory but would be considered a strong plus.).
- 7+ years of experience in backend development, focusing on Node.js, Nest.js, Java, Spring Boot, Camel, and cloud platforms like AWS and GCP.
- Strong knowledge of PostgreSQL, Redis, distributed locking mechanisms, functional programming, design patterns, and advanced isolation levels.
- Hands-on experience with REST and GraphQL API development.
- Familiarity with Kafka, SQS, Kubernetes, and containerized application deployment.
- Practical experience with OLAP databases like BigQuery and Redshift, analytics tools such as Mixpanel and Amplitude, and AI platforms like SageMaker, MLflow, and Vertex AI.
- Knowledge of NLP, data structures like graphs, BK Trees, B+ Trees, and the Pub/Sub paradigm.
- Excellent communication, collaboration, and problem-solving skills with a growth-oriented mindset.
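Purely as an illustration of the vector similarity search mentioned above, here is a minimal brute-force cosine-similarity sketch in NumPy; real systems at this scale would typically use an ANN index or a vector database, and the embeddings below are random placeholders.

    # Minimal sketch: return the top-k most similar items to a query embedding
    # using cosine similarity.
    import numpy as np

    rng = np.random.default_rng(0)
    item_vectors = rng.normal(size=(1000, 128))  # placeholder item embeddings
    query = rng.normal(size=128)                 # placeholder query embedding

    def top_k_cosine(query_vec, matrix, k=5):
        q = query_vec / np.linalg.norm(query_vec)
        m = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
        scores = m @ q
        idx = np.argsort(-scores)[:k]
        return list(zip(idx.tolist(), scores[idx].tolist()))

    print(top_k_cosine(query, item_vectors))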
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
Job type:
Full-time
Salary:
negotiable
- Based at Infinitas by Krungthai.
- We are seeking a Data Scientist specializing in credit decision engine development to drive data-driven lending decisions and risk assessment. The ideal candidate will combine strong analytical skills with a deep understanding of credit processes.
- Model Development & Deployment
- Develop and maintain credit risk assessment and lending decision modules.
- Expertise in handling large financial databases and credit data manipulation.
- Design and implement credit-risk decisioning model solutions using API-based frameworks (e.g., Flask, FastAPI) or event-driven architecture (e.g., Kafka, Pub/Sub), along with other suitable technologies (a minimal sketch follows this list).
- Monitor model performance to ensure high accuracy and reliability in credit decisions.
- Data Analysis & Risk Assessment
- Clean and preprocess financial datasets, particularly credit lending and risk data
- Conduct advanced statistical analyses to support risk assessment and lending decisions.
- Technical Skills
- Proficiency in Python, SQL, and machine learning libraries (TensorFlow, PyTorch, Scikit-Learn)
- Experience with cloud platforms (AWS, GCP) for model deployment.
- Knowledge of statistical and machine learning techniques for risk modeling.
- Domain Expertise
- Understanding of credit lending and risk assessment principles.
- Experience in financial data analysis within regulatory constraints.
- Proven track record in developing credit decision engines (optional).
- Education & Experience
- Bachelor's degree or higher in Statistics, Computer Science, Mathematics, or a related field.
- Minimum 3 years' experience in retail lending or a similar role.
- Additional Requirements
- Strong communication skills for presenting complex findings and process flow to management
- Experience with data visualization tools (Tableau, Power BI)
- Ability to work collaboratively with cross-functional teams.
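As a minimal, hypothetical sketch of the API-based deployment pattern referenced above (not Infinitas's actual engine), the snippet below serves a pre-trained scikit-learn-style model behind a Flask endpoint; the model file, feature names, and approval threshold are placeholders.

    # Minimal sketch: score a credit application and apply a risk cutoff.
    import joblib
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    model = joblib.load("credit_model.joblib")            # placeholder model artifact
    FEATURES = ["income", "debt_ratio", "tenure_months"]  # placeholder feature list
    APPROVAL_THRESHOLD = 0.35                             # placeholder risk cutoff

    @app.route("/score", methods=["POST"])
    def score():
        payload = request.get_json(force=True)
        row = [[payload[f] for f in FEATURES]]
        risk = float(model.predict_proba(row)[0][1])  # estimated probability of default
        return jsonify({"risk_score": risk, "approved": risk < APPROVAL_THRESHOLD})

    if __name__ == "__main__":
        app.run(port=8080)

An event-driven variant would consume application events from Kafka or Pub/Sub and publish decisions instead of exposing an HTTP endpoint.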
- You have read and reviewed Infinitas By Krungthai Company Limited's Privacy Policy at https://krungthai.com/Download/download/DownloadDownload_73Privacy_Policy_Infinitas.pdf. The Bank does not intend or require the processing of any sensitive personal data, including information related to religion and/or blood type, which may appear on copy of your identification card. Therefore, please refrain from uploading any documents, including copy(ies) of your identification card, or providing sensitive personal data or any other information that is unrelated or unnecessary for the purpose of applying for a position on the website. Additionally, please ensure that you have removed any sensitive personal data (if any) from your resume and other documents before uploading them to the website.
- The Bank is required to collect your criminal record information to assess employment eligibility, verify qualifications, or evaluate suitability for certain positions. Your consent to the collection, use, or disclosure of your criminal record information is necessary for entering into an agreement and being considered for the aforementioned purposes. If you do not consent to the collection, use, or disclosure of your criminal record information, or if you later withdraw such consent, the Bank may be unable to proceed with the stated purposes, potentially resulting in the loss of your employment opportunity with the Bank.
Job type:
Full-time
Salary:
negotiable
- Collaborate with stakeholders, product owners, and business analysts to gather and interpret application requirements.
- Design scalable and efficient application architectures, workflows, and user interfaces aligned with business objectives.
- Develop wireframes, prototypes, and detailed design documentation to guide development teams.
- Work closely with developers and QA teams to ensure accurate implementation of design specifications.
- Ensure all designs meet usability, performance, accessibility, and security standards.
- Maintain consistency in design through the use of design systems, patterns, and reusable components.
- Continuously review and improve existing application designs based on user feedback and business needs.
- Stay updated on the latest design trends, tools, and technologies to drive innovation and enhance user experience.
- At least 3 years of experience in application design, service design, or function design.
- At least 2 years of programming experience.
- At least 2 years of experience in database design.
- Prior experience in a system analyst role or a similar role.
- Familiar with Java, Kotlin, or Golang (1st priority), or any other programming language (2nd priority).
- Familiar with databases such as MySQL, Postgres, MariaDB, or MS SQL (1st priority), or any other database (2nd priority).
- Familiarity with Kafka, Pub/Sub, or any message queue is a plus.
- Familiarity with Redis or any in-memory DB is a plus.
- Familiarity with Docker or microservices development is a plus.
- Familiarity with NoSQL databases such as MongoDB or Elasticsearch is a plus.
- Familiarity with cloud technology is a plus.
- Intermediate to advanced SQL skills.
- Knowledge of software architecture (any pattern) is a plus.
- Presentation and communication.
- Analysis.
- Systematic thinking.
Skills:
Research, Automation, Statistics
Job type:
Full-time
Salary:
negotiable
- Work on Data Architecture: use a systematic approach to plan, create, and maintain data architectures while keeping them aligned with business requirements.
- Collect Data: before initiating any work on the database, obtain data from the right sources; after formulating a set of dataset processes, store optimized data.
- Conduct Research: research the industry to address any issues that can arise while tackling a business problem.
- Automate Tasks: dive into data and pinpoint tasks where manual participation can be eliminated with automation.
- Bachelor's degree in IT, Computer Science, Statistics, Mathematics, Business, or a related field.
- Minimum of 3 years' experience in data engineer roles.
- Experience in the data analytics lifecycle, including problem identification, measurement/metrics, exploratory data analysis, and data insight presentation.
- Experience with data tools and languages such as cloud platforms, Python, Java, or similar.
- Experience with data processing, ETL, workflows, and message queues like Kafka (a minimal sketch follows this list).
- Data Warehousing.
- Data Structure.
- ETL tools and programming languages (Python, Java, PySpark).
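Purely for illustration of a small ETL step (not this company's pipeline), the sketch below uses pandas to extract a CSV, apply basic cleaning, and load the result into a local SQLite table; the file names, columns, and table are hypothetical.

    # Minimal sketch: extract -> transform -> load into a warehouse table.
    import sqlite3
    import pandas as pd

    def run_etl(source_csv="sales.csv", db_path="warehouse.db"):
        df = pd.read_csv(source_csv)                         # extract
        df["order_date"] = pd.to_datetime(df["order_date"])  # transform: fix types
        df = df.dropna(subset=["customer_id"])               # transform: basic quality rule
        with sqlite3.connect(db_path) as conn:
            df.to_sql("fact_sales", conn, if_exists="append", index=False)  # load

    if __name__ == "__main__":
        run_etl()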
Skills:
Java, AJAX, Software Development
Job type:
Full-time
Salary:
negotiable
- Design and develop web-based applications using programming languages such as Java and JavaScript, object-oriented design concepts, and database stored procedures. Able to develop applications using JavaScript, Ajax, or Bootstrap development tools.
- Design test cases and test scenarios, generate test scripts for unit and functional testing of the overall programs, execute the tests, and analyze and summarize the results in a test report.
- Understand the requirements and detailed design to develop applications that meet the requirement targets. Able to clarify requirements and map them to programming algorithms for development.
- Analyze and assess the impact on external systems that interface with the corporate business applications.
- Support and solve problems in corporate business applications by analyzing production errors, finding root causes, and identifying interim and long-term solutions after project launch.
- Bachelor's degree in Engineer, Computer Science or IT related field.
- At least 4-5 years of experience in software development.
- Computer languages: web applications, Java, J2EE, JSP, Java Servlet, Spring, Hibernate, EJB, Struts, JavaScript, PL/SQL, and Ajax or Bootstrap development tools.
- Good knowledge of new application technologies: containers, Docker, cloud platforms, and CI/CD on cloud platforms.
- Secure coding that complies with the PCI-DSS standard.
- OS & Database: Oracle, Unix, Linux, MySQL, SQL Command/Server, SQL tuning.
- Design and Implement Customer Data Tracking System
- Define tracking mechanisms across multiple customer touchpoints, including Mobile Applications, Websites, Call Centers, and Physical Stores.
- Develop and standardize Event Schema for tracking customer interactions across different platforms.
- Structure data to facilitate behavioral analysis and personalization strategies.
- Backend Tracking System Design & Implementation
- Architect the backend tracking system using event-driven architecture via a Service Bus or Event Streaming Platform (e.g., Kafka, RabbitMQ, Azure Service Bus); a minimal sketch follows this section.
- Collaborate with DevOps, Backend, and Data Engineering teams to ensure accurate data collection and usability.
- Develop SDKs/Libraries for Tracking
- Develop SDKs or JavaScript Libraries for easy integration into Mobile and Web applications.
- Establish guidelines and provide technical consultation to development teams for proper tracking implementation.
- Consultation and Data Tracking Validation
- Assist Product, Marketing, and Data Science teams in defining Event Tracking Values to ensure meaningful data collection.
- Validate tracking accuracy and help optimize tracking implementation.
- Design Personalized Offering & Recommendation Systems
- Work closely with Data Science teams to design and implement a Personalized Offering System based on customer behavior.
- Define strategies for personalized product recommendations and service offerings.
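As a minimal, hypothetical sketch of the standardized event schema and event-driven publishing described above (not the actual SDK), the snippet below defines a tracking event and sends it to Kafka with kafka-python; the broker address, topic, and field names are placeholders.

    # Minimal sketch: a standardized tracking event published to a Kafka topic.
    import json
    import time
    import uuid
    from dataclasses import dataclass, asdict
    from kafka import KafkaProducer

    @dataclass
    class TrackingEvent:
        event_name: str    # e.g. "add_to_cart"
        channel: str       # "mobile_app" | "web" | "call_center" | "store"
        customer_id: str
        properties: dict
        event_id: str = ""
        timestamp: float = 0.0

    producer = KafkaProducer(
        bootstrap_servers="broker:9092",  # placeholder broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    def track(event: TrackingEvent, topic: str = "customer-events"):  # placeholder topic
        event.event_id = event.event_id or str(uuid.uuid4())
        event.timestamp = event.timestamp or time.time()
        producer.send(topic, asdict(event))

    track(TrackingEvent("add_to_cart", "web", "c-123", {"sku": "A1", "qty": 2}))
    producer.flush()

A mobile or web SDK would wrap the same schema so every touchpoint emits events in a consistent shape.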
- Bachelor's degree in Engineer, Computer Science or IT related field.
- Proficiency in mobile (Java, Kotlin, Swift, React Native, Flutter) and web (JavaScript, Python, HTML, CSS) languages; understanding of frameworks (React, Angular, Vue.js, NestJS); mobile/web development lifecycle knowledge; database (SQL, NoSQL) and API expertise; version control (Git); cloud platform knowledge; and testing and debugging.
- Requirements gathering/analysis, system design, problem-solving, communication, basic project management, and UI/UX understanding.