Skills:
ETL, Compliance, SQL
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into a centralized data warehouse or data lake (see the illustrative sketch after this posting).
- Integrate data from different sources, such as databases, APIs, and third-party applications, ensuring data consistency and accuracy.
- Create and maintain data models and schemas to facilitate data storage and retrieval, following best practices for data warehousing and database management.
- Implement data quality checks and validation processes to ensure data accuracy, completeness, and consistency.
- Optimize data pipelines and systems for performance, scalability, and efficiency, making sure data processing meets business requirements.
- Implement data security measures to protect sensitive information and ensure compliance with data privacy regulations (e.g., GDPR, HIPAA).
- Document data engineering processes, data lineage, and system architecture to facilitate knowledge sharing and future maintenance.
- Set up monitoring and alerting systems to detect and address issues with data pipelines and systems proactively.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand their data needs and provide the necessary infrastructure and support.
- Stay up-to-date with the latest data engineering technologies and tools, and evaluate their applicability to the organization's data stack.
- Bachelor's degree in computer science, information technology, or a related field. A master's degree is a plus.
- Strong proficiency in data engineering tools and technologies, such as SQL and ETL frameworks (e.g., Apache Spark, Apache Airflow, Apache Beam).
- Experience with Google BigQuery for data warehousing.
- Experience with Google Cloud Dataflow or Dataproc.
- Experience with programming languages like Python, Java, or Scala.
- Experience with building streaming data pipelines on Google Cloud.
- Knowledge of database design, data modeling, and data integration techniques.
- Familiarity with data governance, data security, and compliance standards.
- Problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
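Illustrative aside (not part of the posting): a minimal sketch of the kind of Airflow-to-BigQuery batch load the responsibilities above describe. The DAG id, bucket, and dataset.table names are hypothetical placeholders, and the Google provider package (apache-airflow-providers-google) is assumed to be installed.

```python
# Minimal Airflow DAG sketch: load daily CSV drops from GCS into BigQuery.
# All names below (DAG id, bucket, dataset.table) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_to_bq",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales",
        bucket="example-raw-data",                # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],  # one folder per run date
        destination_project_dataset_table="example_dw.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,                          # infer schema from the files
        write_disposition="WRITE_TRUNCATE",       # idempotent re-runs
    )
```

A streaming variant of the same flow would typically swap the batch operator for Dataflow jobs, which this posting also names.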
Skills:
Data Analysis, ETL, Data Warehousing
Job type:
Full-time
Salary:
negotiable
- Data Architecture: Design, develop, and maintain the overall data architecture and data pipeline systems to ensure efficient data flow and accessibility for analytical purposes.
- Data Integration: Integrate data from multiple sources, including point-of-sale systems, customer databases, e-commerce platforms, supply chain systems, and other relevant data sources, ensuring data quality and consistency.
- Data Modeling: Design and implement data models that are optimized for scalability, ...
- Data Transformation and ETL: Develop and maintain efficient Extract, Transform, and Load (ETL) processes to transform raw data into a structured format suitable for analysis and reporting.
- Data Warehousing: Build and maintain data warehouses or data marts that enable efficient storage and retrieval of structured and unstructured data for reporting and analytics purposes.
- Data Governance and Security: Establish and enforce data governance policies and procedures, including data privacy and security measures, to ensure compliance with industry regulations and protect sensitive data.
- Data Quality and Monitoring: Implement data quality checks and monitoring mechanisms to identify and resolve data inconsistencies, anomalies, and issues in a timely manner.
- Collaboration: Collaborate with cross-functional teams, including data scientists, business analysts, and software engineers, to understand their data needs, provide data solutions, and support their analytical initiatives.
- Performance Optimization: Optimize data processing and query performance to ensure efficient data retrieval and analysis, considering factors such as data volume, velocity, and variety.
- Documentation: Maintain documentation of data processes, data flows, data models, and system configurations, ensuring accuracy and accessibility for future reference.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong programming skills in languages such as Python and SQL; C++ is a plus.
- At least 5 years of experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL) and big data technologies (e.g., Hadoop, Spark, Hive).
- Familiarity with cloud-based data platforms, such as AWS.
- Knowledge of ETL tools and techniques for data integration and transformation.
- Understanding of data governance, data security, and regulatory compliance requirements.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.
- Ability to work in a fast-paced environment and handle multiple projects simultaneously.
- Location: BTS Ekkamai
- Working Day: Mon-Fri.
Skills:
Big Data, ETL, SQL
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meetings with users to understand data requirements, and perform database design based on that understanding and those requirements, with consideration for performance.
- Maintain the data dictionary, data relationships, and their interpretation.
- Analyze problems and find resolutions, and work closely with administrators to monitor performance and advise on any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming (see the streaming sketch after this posting).
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor's degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information Systems, or an IT-related field.
- 3+ years' experience in Data Management or Data Engineering (retail or e-commerce business is preferable).
- Expert experience in query languages (SQL), Databricks SQL, and PostgreSQL.
- Experience with Big Data technologies such as Hadoop, Apache Spark, and Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- Good attitude toward teamwork and willingness to work hard.
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy.
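Illustrative aside (not part of the posting): one way the near real-time streaming duty above might look in code, sketched with Spark Structured Streaming. The broker address, topic, and storage paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Hedged sketch: near real-time ingestion of order events with Spark
# Structured Streaming. Broker, topic, and paths are made-up placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Read raw events from a hypothetical Kafka topic.
orders = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load())

# Pull one field out of the JSON payload; keep the Kafka timestamp.
parsed = orders.select(
    F.get_json_object(F.col("value").cast("string"), "$.order_id").alias("order_id"),
    F.col("timestamp"),
)

# Append micro-batches to the lake; the checkpoint makes restarts safe.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3a://example-lake/orders/")
         .option("checkpointLocation", "s3a://example-lake/_chk/orders/")
         .outputMode("append")
         .start())
query.awaitTermination()
```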
Skills:
ETL, Big Data, SQL
Job type:
Full-time
Salary:
negotiable
- Design and develop ETL solutions using data integration tools for interfacing between source applications and the Enterprise Data Warehouse.
- Experience with Big Data or data warehouses.
- Analyze & translate functional specifications & change requests into technical specifications.
- Experience in SQL programming in an RDBMS such as Oracle.
- Develop ETL technical specifications; design, develop, test, implement, and support optimal data solutions.
- Develop and document ETL data mappings, data dictionaries, processes, programs, and solutions per established data governance standards.
- Design and write code for all related data extraction, transformation, and loading (ETL) into the databases under your responsibility.
- Create, execute, and document unit test plans for ETL and data integration processes and programs (see the test sketch after this posting).
- Perform problem assessment, resolution, and documentation for existing ETL packages, mappings, and workflows in production.
- Tune the performance of ETL processes and SQL queries, and recommend and implement ETL and query tuning techniques.
- Qualifications: Bachelor's degree in Information Technology, Computer Science, Computer Engineering, or a related field.
- Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and automation frameworks.
- Proficiency in cloud platforms such as AWS, Azure, and associated services.
- Knowledge of IAC tools like Terraform or Azure ARM.
- Familiarity with monitoring and logging tools like Prometheus, Grafana, ELK stack, APM, etc.
- Good understanding of IT Operations.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and teamwork skills to collaborate effectively across various teams.
- We're committed to bringing passion and customer focus to the business. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us.
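Illustrative aside (not part of the posting): the unit-test duty above, sketched as a pytest case over a made-up normalize_amounts() transform; the function and column names are hypothetical.

```python
# Hedged sketch: unit-testing an ETL transform step with pytest and pandas.
# normalize_amounts() and its columns are invented for illustration.
import pandas as pd

def normalize_amounts(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with a missing key and cast amounts to float."""
    out = df.dropna(subset=["order_id"]).copy()
    out["amount"] = out["amount"].astype(float)
    return out

def test_normalize_amounts_drops_missing_keys():
    raw = pd.DataFrame({"order_id": [1, None], "amount": ["10.5", "3"]})
    result = normalize_amounts(raw)
    assert len(result) == 1                    # row with a null key is removed
    assert result["amount"].dtype == float     # string amounts are cast to float
```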
Skills:
ETL, Compliance, SSIS, English
Job type:
Full-time
Salary:
negotiable
- Design and build scalable, high-performance data pipelines and ETL processes to support seamless data integration across systems.
- Partner closely with data analysts, data scientists, and business stakeholders to understand their needs and transform them into efficient technical solutions.
- Develop and manage robust data models, ensuring consistent data quality, integrity, and accessibility.
- Optimize data processing pipelines, troubleshoot performance issues, and implement improvements to enhance efficiency.
- Document workflows, system configurations, and processes to maintain compliance and facilitate future enhancements.
- Stay ahead of the curve by keeping up-to-date with the latest advancements in data engineering, cloud platforms, and analytics tools.
- ABOUT YOU.
- Demonstrated expertise as a Data Engineer, with a strong background in designing and maintaining ETL pipelines using tools like Databricks and SSIS.
- Proficient in SQL, with a deep understanding of data warehousing concepts and best practices.
- Familiarity with cloud platforms such as Azure or AWS is highly desirable.
- Strong analytical and problem-solving skills, coupled with an exceptional attention to detail.
- Excellent communication and collaboration abilities, with fluency in English and a proven track record of working effectively with cross-functional teams.
- WHY AMARIS?
- A global community of talented professionals, fostering innovation and collaboration.
- An empowering environment that prioritizes trust and offers ample opportunities for growth and development.
- Access to world-class training programs and resources to advance your career.
- A workplace culture centered around teamwork, inclusivity, and a commitment to corporate social responsibility.
- Amaris Consulting is proud to be an equal-opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability, or other characteristics.
Skills:
Data Analysis, SQL, Problem Solving, English
Job type:
Full-time
Salary:
negotiable
- Working closely with business and technical domain experts to identify data requirements that are relevant for analytics and business intelligence.
- Implement data solutions and ensure data comprehensiveness for data customers.
- Working closely with engineering to ensure data service solutions are ultimately delivered in a timely and cost-effective manner.
- Retrieve and prepare data (automated if possible) to support business data analysis.
- Ensure adherence to the highest standards in data privacy protection and data governance.
- Bachelor's or Master's Degree in Computer Science, Computer Engineering, or a related field.
- Minimum of 1 year of experience with relational/non-relational database systems and a good command of SQL.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Experience with cloud-based platforms such as AWS, Google Cloud platform or similar.
- Experience in batch, real-time, and near real-time data processing.
- Experience with data integration or ETL management tools such as AWS Glue, Databricks, or similar.
- Experience with web or software development in Java, Python, or similar.
- Experience with Agile methodology is a plus.
- Good in communication and writing in English.
- Good interpersonal and communication skills.
Experience:
5 years required
Skills:
Python, ETL, Compliance
Job type:
Full-time
Salary:
negotiable
- Design and implement scalable, reliable, and efficient data pipelines for ingesting, processing, and storing large amounts of data from a variety of sources using cloud-based technologies, Python, and PySpark (see the illustrative sketch after this posting).
- Build and maintain data lakes, data warehouses, and other data storage and processing systems on the cloud.
- Write and maintain ETL/ELT jobs and data integration scripts to ensure smooth and accurate data flow.
- Implement data security and compliance measures to protect data privacy and ensure regulatory compliance.
- Collaborate with data scientists and analysts to understand their data needs and provide them with access to the required data.
- Stay up-to-date on the latest developments in cloud-based data engineering, particularly in the context of Azure, AWS and GCP, and proactively bring new ideas and technologies to the team.
- Monitor and optimize the performance of data pipelines and systems, identifying and resolving any issues or bottlenecks that may arise.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum of 5 years of experience as a Data Engineer, with a strong focus on cloud-based data infrastructure.
- Proficient programming skills in Python, Java, or a similar language, with an emphasis on Python.
- Extensive experience with cloud-based data storage and processing technologies, particularly Azure, AWS and GCP.
- Familiarity with ETL/ELT tools and frameworks such as Apache Beam, Apache Spark, or Apache Flink.
- Knowledge of data modeling principles and experience working with SQL databases.
- Strong problem-solving skills and the ability to troubleshoot and resolve issues efficiently.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Location: True Digital Park, Bangkok (Hybrid working).
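Illustrative aside (not part of the posting): a minimal PySpark batch step of the ingest-transform-store kind the responsibilities above describe. The input and output paths and the column names are hypothetical.

```python
# Hedged sketch: a small batch ETL step in PySpark. Paths and columns
# (event_id, event_ts) are made-up placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_etl").getOrCreate()

raw = spark.read.json("s3a://example-landing/events/")    # hypothetical source

cleaned = (
    raw.dropDuplicates(["event_id"])                      # basic de-duplication
       .filter(F.col("event_id").isNotNull())             # simple quality gate
       .withColumn("event_date", F.to_date("event_ts"))   # derive partition key
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")                        # date-partitioned layout
        .parquet("s3a://example-lake/events_clean/"))     # hypothetical sink
```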
Skills:
SQL, Data Warehousing, ETL, English
Job type:
Full-time
Salary:
negotiable
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Retrieve, prepare, and process data from a rich variety of data sources.
- Apply data quality, cleaning, and semantic inference techniques to maintain high data quality (see the quality-check sketch after this posting).
- Explore data sources to better understand the availability and quality/integrity of data.
- Gain fluency in AI/ML techniques.
- Experience with relational database systems with expertise in SQL.
- Experience in data management, data warehousing or unstructured data environments.
- Experience with data integration or ETL management tools such as Talend, Apache Airflow, AWS Glue, Google DataFlow or similar.
- Experience programming in Python, Java or other equivalent is a plus.
- Experience with Business Intelligence tools and platforms is a plus, e.g. Tableau, QlikView, Google DataStudio, Google Analytics or similar.
- Experience with Agile methodology and Extreme Programming is a plus.
- Ability to meet critical deadlines and prioritize multiple tasks in a fast-paced environment.
- Ability to work independently, have strong problem solving and organization skills, with a high initiative and a sense of accountability and ownership.
- Good communication in English.
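Illustrative aside (not part of the posting): the data quality duty above, sketched as two simple checks (null-key rate and duplicate-key rate) in pandas; the column name and threshold are made up.

```python
# Hedged sketch: basic data quality checks over a keyed DataFrame.
# customer_id and the 25% threshold are invented for illustration.
import pandas as pd

def quality_report(df: pd.DataFrame, key: str) -> dict:
    """Return row count plus null and duplicate rates for `key`."""
    total = len(df)
    return {
        "rows": total,
        "null_key_rate": df[key].isna().mean() if total else 0.0,
        "duplicate_key_rate": df[key].duplicated().mean() if total else 0.0,
    }

df = pd.DataFrame({"customer_id": [1, 2, 2, None]})
report = quality_report(df, key="customer_id")
assert report["null_key_rate"] <= 0.25, "too many null keys"  # example threshold
print(report)
```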
Experience:
5 years required
Skills:
ETL, Quantitative Analysis, Industry trends
Job type:
Full-time
Salary:
negotiable
- Translating business requirements to technical solutions leveraging strong business acumen.
- You will be a core member of the EY Microsoft Data and AI team, responsible for extracting large quantities of data from clients' IT systems, developing efficient ETL and data management processes, and building architectures for rapid ingestion and dissemination of key data.
- Apply expertise in quantitative analysis, data mining and presentation of data to de ...
- Extremely flexible, with experience managing multiple tasks and priorities on deadlines.
- Applying technical knowledge to architect solutions that meet business and IT needs, create Data Platform roadmaps, and enable the Data Platforms to scale to support additional use cases.
- Staying abreast of current business and industry trends relevant to the client's business.
- Monitoring progress, managing risk, and ensuring key stakeholders are kept informed about progress and expected outcomes.
- Understanding customers' overall data estate, IT and business priorities, and success measures to design implementation architectures and solutions.
- Strong team collaboration and experience working with remote teams.
- Working on large-scale client engagements. Fostering relationships with client personnel at appropriate levels. Consistently delivering quality client services. Driving high-quality work products within expected timeframes and on budget.
- Demonstrated significant professional experience in commercial, strategy, and/or research/analytics roles, interacting with senior stakeholders to effectively communicate insights.
- Execute on building data solutions for business intelligence and assist in effectively managing and monitoring the data ecosystem of analytics, data lakes, warehouses platforms and tools.
- Provide directional guidance and recommendations around data flows including data technology, data integrations, data models, and data storage formats.
- To qualify for the role, you must have:
- Bachelor's degree or MS degree in Business, Economics, Technology Entrepreneurship, Computer Science, Informatics, Statistics, Applied Mathematics, Data Science, or Machine Learning.
- Minimum of 3-5 years of relevant consulting experience with a focus on advanced analytics and business intelligence or similar roles. New graduates are welcome!
- Communication and critical thinking are essential; you must be able to listen to and understand the question, then develop and deliver clear insights.
- Experience communicating the results of analysis to both technical and non-technical audiences.
- Independent and able to manage and prioritize workload.
- Ability to adapt quickly and positively to change.
- Breadth of technical passion, desire to learn, and knowledge of services.
- Willingness and ability to travel to meet clients if needed.
- Ideally, you'll also have:
- Experience working business or IT transformation projects that have supported data science, business intelligence, artificial intelligence, and cloud applications at scale.
- Ability to communicate clearly and succinctly, adjust to a variety of styles and audiences, and tell compelling stories with the data.
- Experience with C#, VBA, JavaScript, R.
- A vast understanding of key BI trends and the BI vendor landscape.
- Working experience with Agile and/or Scrum methods of delivery.
- Working experience with design led thinking.
- Microsoft Certifications in the Data & AI domain.
- We're interested in passionate leaders with strong vision and a desire to deeply understand the trends at the intersection of business and Data and AI. We want a customer-focused professional who is motivated to drive the creation of great enterprise products and who can collaborate and partner with other product teams and engineers. If you have a genuine passion for helping businesses achieve the full potential of their data, this role is for you.
- What we offer.
- We offer a competitive compensation package where you'll be rewarded based on your performance and recognized for the value you bring to our business. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
- If you can demonstrate that you meet the criteria above, please contact us as soon as possible.
- The exceptional EY experience. It's yours to build.
- EY | Building a better working world.
- EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
- Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
- Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
- EY is an equal opportunity, affirmative action employer providing equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, national origin, protected veteran status, disability status, or any other legally protected basis, in accordance with applicable law.
Experience:
4 years required
Skills:
Statistics, Finance, Accounting
Job type:
Full-time
Salary:
negotiable
- Preparation and delivery of assignments. Provide services to a wide variety of clients, gaining an understanding of their unique businesses and needs. Work with clients in the banking sector and financial institutions.
- Your role as a leader: At Deloitte, we believe in the importance of empowering our people to be leaders at all levels. We connect our purpose and shared values to identify issues as well as to make an impact that matters to our clients, people and the communities. Additionally, Senior Associates / Senior Consultants / Assistant Manager ...
- Respect the needs of their colleagues and build up cooperative relationships.
- Understand the goals of our internal and external stakeholders to set personal priorities as well as align their teams' work to achieve the objectives.
- Constantly challenge themselves, collaborate with others to deliver on tasks and take accountability for the results.
- Build productive relationships and communicate effectively in order to positively influence teams and other stakeholders.
- Offer insights based on a solid understanding of what makes Deloitte successful.
- Project integrity and confidence while motivating others through team collaboration as well as recognising individual strengths, differences, and contributions.
- Understand disruptive trends and promote potential opportunities for improvement.
- You are someone with: Bachelor's degree in Computer Science, Engineering, Statistics, Finance, Accounting, or a related field with a good academic record.
- You should have 3 to 4 years of professional experience acquired in one or more of the following areas: Business analyst on a team developing financial software for banks or any NBFIs.
- Gather banks' business requirements for adopting new Financial Reporting Standards for Financial Instruments (IFRS 9).
- Assist with data mapping process, analyse data requirements, and design data mapping solutions.
- Assist project manager with planning and controlling of the developing process.
- Design the calculation, accounting and financial reports of the financial transactions of the banks or any NBFIs.
- Responsible for developing and performing test scenarios, unit testing, system integration test (SIT) and User Acceptance test (UAT).
- Design reports and dashboards to meet client and Bank of Thailand (BOT) requirements.
- Prepare and deliver user documentation, including functional specification, user manual, data mapping, and change requirement documents.
- Investigate issues occurring in the system and provide solutions.
- Experience in a consulting firm is preferred.
- Knowledge on SQL Programming on SQL Server (T-SQL).
- Basic knowledge on ETL.
- Due to the volume of applications, we regret that only shortlisted candidates will be notified.
- Please note that Deloitte will never reach out to you directly via messaging platforms to offer you employment opportunities or request money or your personal information. Kindly apply for roles that you are interested in via this official Deloitte website. Requisition ID: 107576. In Thailand, the services are provided by Deloitte Touche Tohmatsu Jaiyos Co., Ltd. and other related entities in Thailand ("Deloitte in Thailand"), which are affiliates of Deloitte Southeast Asia Ltd. Deloitte Southeast Asia Ltd is a member firm of Deloitte Touche Tohmatsu Limited. Deloitte in Thailand, which is within the Deloitte Network, is the entity that is providing this Website.
Experience:
6 years required
Skills:
Finance, Data Analysis, Excel
Job type:
Full-time
Salary:
negotiable
- Work closely with a diverse set of stakeholders in Finance and other parts of the organization.
- Consult stakeholders to propose the best suited data solution.
- Build end-to-end solutions including data workflows, dashboards, reports etc.
- Become familiar with financial data ecosystem at Agoda.
- Help the Finance Analytics team drive value for the Finance department and Agoda as a whole.
- Undergraduate degree.
- 6+ years of leadership experience in analytics/data science/insights/strategy.
- 3+ years' experience leading analytics, operational, product or other technical teams.
- Expert domain of data analysis and data visualization tools and software such as Excel, SQL, Tableau, Python, R, or similar.
- Experience with ETL tools and data modelling, and proficient knowledge of SQL and Relational Databases: the ability to write, execute, and interpret queries is essential.
- Quick learner, problem-solving aptitude, effective prioritization, proactive and strong attention to detail.
- High sense of ownership and growth mindset, ability to be self-directed.
- Ability to understand business questions/requests and be able to suggest proper BI solutions which are measurable and scalable.
- Excellent communication skills and ability to influence peers and build strong relationships within Finance and cross-functionally.
- Experience in articulating strategic issues and negotiating with C-level executives - experience in leading strategy consulting projects a plus.
- People management - track record of developing stars.
- Ability and willingness to drive projects independently, working efficiently to deliver results rapidly and engaging the relevant stakeholders throughout the process.
- Accounting/Financial knowledge and commercial acumen.
- Master's degree in statistics, economics, mathematics or similar discipline.
- Solid technical/functional knowledge in statistics.
- Familiarity with scrum/agile methodology.
- Other helpful skills - T-SQL, batch scripting, ODBC, data mining, Hadoop.
- Experience with Wallet and Financial Payment systems, including integration and data analysis.
- Understanding of digital payment processes and financial transaction data.
- Equal Opportunity Employer.
- At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
- We will keep your application on file so that we can consider you for future vacancies and you can always ask to have your details removed from the file. For more details please read our privacy policy.
- To all recruitment agencies: Agoda does not accept third party resumes. Please do not send resumes to our jobs alias, Agoda employees or any other organization location. Agoda is not responsible for any fees related to unsolicited resumes.
Skills:
Big Data, Java, Python
Job type:
Full-time
Salary:
negotiable
- Background in programming, databases and/or big data technologies OR.
- BS/MS in software engineering, computer science, economics or other engineering fields.
- Partner with Data Architect and Data Integration Engineer to enhance/maintain optimal data pipeline architecture aligned to published standards.
- Assemble medium and complex data sets that meet functional/non-functional business requirements.
- Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including Domain leads, and Teams to assist with data-related technical issues and support their data infrastructure needs.
- Ensure technology footprint adheres to data security policies and procedures related to encryption, obfuscation and role based access.
- Create data tools for analytics and data scientist team members.
- Functional Competency.
- Knowledge of data and analytics framework supporting data lakes, warehouses, marts, reporting, etc.
- Defining data retention policies, monitoring performance and advising any necessary infrastructure changes based on functional and non-functional requirements.
- In depth knowledge of data engineering discipline.
- Extensive experience working with Big Data tools and building data solutions for advanced analytics.
- Minimum of 5+ years' hands-on experience with a strong data background.
- Solid programming skills in Java, Python and SQL.
- Clear hands-on experience with database systems - Hadoop ecosystem, Cloud technologies (e.g. AWS, Azure, Google), in-memory database systems (e.g. HANA, Hazelcast, etc.) and other database systems - traditional RDBMS (e.g. Teradata, SQL Server, Oracle), and NoSQL databases (e.g. Cassandra, MongoDB, DynamoDB).
- Practical knowledge across data extraction and transformation tools - traditional ETL tools (e.g. Informatica, Ab Initio, Alteryx) as well as more recent big data tools.
Skills:
Business Development, Creative Thinking, Project Management, English
Job type:
Full-time
Salary:
negotiable
- Strong analytical and problem-solving skills to identify commercial opportunity.
- Working well in a cross-disciplinary team with different types of stakeholders (IT, Agency, Business, Management).
- Business Development of Data Intelligence for corporate strategy.
- Analyze internal and external data in various aspects to identify threats & opportunities and provide information/report for management or related business unit team to plan activities and strategies.
- Participate in the initiative's development plan of business unit / brand plans and align with corporate strategy, objectives and KPIs.
- Coordinate and consolidate with related departments to implement projects, track project progress, and provide corrective supervision if necessary.
- Create and deliver insights report on new ideas to the management team or business units and seek appropriate decisions, directions, and approvals.
- Bachelor's or Master's degree in business or a related field of study.
- Minimum 5-8 years in Performance Management functions / Commercial Analyst roles.
- Experience in corporate/channel/brand/segment strategy.
- Experience working in Data Analytic related projects.
- Excellent analytical and problem-solving skills.
- Ability to apply logical and creative thinking to solve complex business problems.
- Ability to define the problems and frame answers in a logical and structured manner.
- Good project management, team leadership and sense of ownership.
- Good coordination skills with a positive attitude and the ability to work under pressure.
- Strong communications, customer relationship and negotiation skills.
- Good command of both written and spoken English.
- TECHNICAL SKILLS: Basic understanding of data ecosystem, Advanced skills in dashboard and BI tools.
- Conceptual knowledge of data and analytics, ETL, reporting tools, data governance, data warehousing, structured and unstructured data.
Skills:
Data Analysis, Python, SQL
Job type:
Full-time
Salary:
negotiable
- Data Collection and Cleaning: Gather data from various sources and prepare it for analysis by cleaning and checking for completeness or errors.
- Data Analysis: Utilize statistical techniques to study, analyze, discover insights, and derive meaningful data-driven conclusions.
- Reporting: Create reports in various formats, presenting data through graphs, features, or figures for stakeholders' use.
- Collaboration: Work with other teams to plan data analysis or resolve specific issues affecting organizational goals.
- Performance Monitoring: Define and track key performance indicators (KPIs) to measure the effectiveness and efficiency of various projects.
- Strategic Planning: Assist the management team in strategic planning by using data analysis to improve workflows and processes.
- Bachelor's or master's degree in computer science, information systems, or a related field.
- Strong skills in tools and languages such as Power BI, R, Python, SQL, and C++.
- At least 5 years of experience with data modeling, database design, and data warehousing concepts.
- Proficiency in working with relational databases (e.g., MySQL, PostgreSQL).
- Knowledge of ETL tools and techniques for data integration and transformation is a plus.
- Location: BTS Ekkamai
- Working Day: Mon-Fri.
Experience:
No experience required
Job type:
Full-time
- Manage projects as required on initiatives to ensure they are delivered on time, within budget, and actively tracked.
- Ability to work on large initiatives independently.
- Act as scrum master to ensure complete delivery of work for a Sprint/Release, by ensuring artefacts are complete and ready for development to begin and by assisting teams.
- Creation of project artefacts: Project Request (PR), Request for Proposal (RFP), Project Initiation Document (PID), Business Requirements, Functional Requirements, Use Cases, Flow Charts, Project Solution Architecture Plan (PSAP), etc., in various formats, for any initiatives related to Disruptive Digital Solution.
- Facilitate workshops & stakeholder sessions to elicit requirements and specifications for artefacts & deliverables.
- Support Disruptive Digital Solution documents before they are distributed.
- Must have experience as a Technical Lead, Squad Lead, or in related roles.
- Familiar with the Software Development Life Cycle and Change Management Process.
- Experience in Java Spring Boot, SQL, and ETL.
- Experience implementing and supporting large-scale projects/systems in Banking is preferred.
- Analytical skills and the ability to see the connections between layers of business operations.
- Ability to consult management and engineering teams with technical advice.