Skills:
Apache, Compliance, Automation
Job type:
Full-time
Salary:
negotiable
- Design, develop, and maintain robust and scalable data pipelines using tools such as Apache Airflow, PySpark, and cloud-native services (e.g., Azure Data Factory, Microsoft Fabric Pipelines).
- Manage data ingestion from APIs, files, and databases into data lakes or data warehouses (e.g., Microsoft Fabric Lakehouse, Iceberg, DWS).
- Ensure seamless data integration across on-premise, cloud, and hybrid environments.
- Implement data validation, standardization, and transformation to ensure high data quality.
- Apply data encryption, masking, and compliance controls to maintain security and privacy standards.
- AI & Intelligent Automation: Collaborate with Data Scientists to deploy ML models and integrate predictive insights into production pipelines (e.g., using Azure Machine Learning or Fabric Notebooks).
- Support AI-powered automation and data insight generation through tools like Microsoft Copilot Studio or LLM-powered interfaces (chat-to-data).
- Assist in building lightweight AI chatbots or agents that leverage existing datasets to enhance business efficiency.
- Qualifications & Skills: 3-5+ years of experience in Data Engineering or AI Engineering roles.
- Proficiency in Python, SQL, and big data frameworks (Apache Airflow, Spark, PySpark).
- Experience with cloud platforms: Azure, Huawei Cloud, or AWS.
- Familiar with Microsoft Fabric services: OneLake, Lakehouse, Notebooks, Pipelines, and Real-Time Analytics.
- Hands-on with Microsoft Copilot Studio to design chatbots, agents, or LLM-based solutions.
- Experience in ML model deployment using Azure ML, ModelArts, or similar platforms.
- Understanding of vector databases (e.g., Qdrant), LLM orchestration (e.g., LangChain), and prompt engineering is a plus.
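The data masking control mentioned in the responsibilities above can be sketched in plain Python. This is a minimal, hypothetical example; the field names (`email`, `phone`) and masking rules are illustrative assumptions, not taken from the posting:

```python
import re

def mask_email(value: str) -> str:
    """Mask the local part of an email, keeping the first character and domain."""
    local, _, domain = value.partition("@")
    return f"{local[:1]}***@{domain}" if domain else "***"

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PII fields masked (hypothetical field list)."""
    masked = dict(record)
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    if "phone" in masked:
        # Keep only the last four digits of the phone number.
        digits = re.sub(r"\D", "", masked["phone"])
        masked["phone"] = "*" * max(len(digits) - 4, 0) + digits[-4:]
    return masked

print(mask_record({"email": "somchai@example.com", "phone": "081-234-5678"}))
```

In production this would typically be enforced in the warehouse itself (e.g., dynamic data masking policies) rather than in application code.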
Job type:
Full-time
Salary:
negotiable
- Design and implement methods for storing, retrieving, and monitoring data pipelines: ingesting raw data sources, then transforming, cleaning, storing, and enriching the data so it is promptly usable, for both structured and unstructured data, working with data lakes and cloud-based big data technology.
- Develop data pipeline automation using Azure technologies, Databricks and Data Factory
- Understand data, report, and dashboard requirements; develop data visualizations using Power BI or Tableau, working across workstreams to support data needs and collaborating with data scientists, data analysts, the data governance team, and business stakeholders on multiple projects
- Analyze and perform data profiling to understand data patterns following Data Quality and Data Management processes
- 3+ years of experience in big data technology, data engineering, or data analytics application system development.
- Experience with unstructured data for business intelligence would be an advantage.
- Technical skills in SQL, UNIX shell scripting, Python, R, Spark, and Hadoop.
Skills:
ETL, Power BI, SQL
Job type:
Full-time
Salary:
negotiable
- Responsible for designing data models that support long-term use.
- Help plan the creation of data marts suited to various use cases.
- Work with the ETL / Data Engineer team to align schemas and pipelines with requirements.
- Communicate well and translate business requirements into data solutions.
- Perform in-depth data analysis for business units.
- Design and develop advanced dashboards / reports on Power BI.
- Extract data from the DWH using complex SQL, including data wrangling.
- 5 years of experience in a data role (BA or SA).
- Strong proficiency in SQL.
- Proficient with visualization tools such as Power BI and/or other data tools.
- Good knowledge and understanding of data warehouse concepts and ETL.
- Skills in designing data models and data marts.
- Analytical and problem-solving skills.
- Contact Information:
- K. Sawarin Tel.
- Office of Human Capital.
- DIGITAL AND TECHNOLOGY SERVICES CO., LTD.
- F.Y.I Center 2525 Rama IV Rd, Khlong Tan, Khlong Toei, Bangkok 10110.
- MRT QSNCC Station Exit 1.
Skills:
Compliance, Python, SQL
Job type:
Full-time
Salary:
negotiable
- The Lead System Analyst/Senior Data Engineer is assigned to the IECC Project and Finance-Risk and Compliance Data initiatives, supporting solution design and data integration between upstream applications, downstream applications, and business users.
- To design, build, and operate reliable data pipelines across batch, near-real-time, and real-time workloads.
- To utilize multiple technologies (e.g. Python, SQL/Stored Procedures, ETL/ELT tools) to ingest, transform, and deliver governed, audit-ready data.
- To orchestrate and monitor jobs, implement data quality controls, and ensure security, lineage, and observability, while modernizing existing workflows with automation, testing, and performance tuning.
- Build and maintain ingestion, transformation, and delivery pipelines that produce governed, audit-ready datasets.
- Use Python, SQL/Stored Procedures, and ETL/ELT frameworks (or any relevant technologies) to implement scalable and reusable data pipeline components.
- Orchestrate and monitor workloads (e.g., DAGs/schedulers), ensuring reliability, idempotency, and rerunnability.
- Enforce data quality (completeness, validity, accuracy, timeliness, uniqueness) and reconciliation checks.
- Ensure security and compliance: access control, PII handling, encryption, and audit logging.
- Design and manage workflow orchestration for reliable execution, monitoring, and failure recovery with Airflow/Control-M/ESP (DAGs, retries, backfills, idempotency).
- Collaborate with Architects/Stewards to apply a Shared Canonical Model (CDM) and data standards.
- Implement security controls (RBAC/ABAC), PII masking, encryption in-transit/at-rest, and auditable logs.
- Maintain runbooks, technical specifications (e.g. data mapping), and contribute to CI/CD (Git, artifacts, release notes).
- Monitor pipelines (SLIs/SLOs), diagnose incidents, and drive continuous performance and cost improvements.
- Promote data literacy and a data-driven culture through cross-functional collaboration.
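The data-quality dimensions listed above (completeness, validity, uniqueness, etc.) can be sketched as simple row-level checks. This is an illustrative sketch only; the column names (`id`, `amount`) and rules are hypothetical:

```python
def check_quality(rows, required=("id", "amount")):
    """Run basic completeness, validity, and uniqueness checks on a batch of rows.

    Returns a dict of issue counters; column names and rules are illustrative.
    """
    issues = {"missing_field": 0, "invalid_amount": 0, "duplicate_id": 0}
    seen_ids = set()
    for row in rows:
        # Completeness: every required field must be present and non-empty.
        if any(row.get(col) in (None, "") for col in required):
            issues["missing_field"] += 1
            continue
        # Validity: amount must parse as a non-negative number.
        try:
            if float(row["amount"]) < 0:
                issues["invalid_amount"] += 1
        except (TypeError, ValueError):
            issues["invalid_amount"] += 1
        # Uniqueness: ids must not repeat within the batch.
        if row["id"] in seen_ids:
            issues["duplicate_id"] += 1
        seen_ids.add(row["id"])
    return issues

batch = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "-3"},   # duplicate id and negative amount
    {"id": 2, "amount": ""},     # missing amount
]
print(check_quality(batch))
```

In practice such checks would feed reconciliation reports and alerting rather than a simple counter dict.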
- Apply now if you have these advantages.
- Bachelor's / Master degree in Computer Engineer, Computer Science, Information Technology, or related fields.
- At least 8-12 years as a System Analyst / Data Engineer, including 2-3 years in the banking industry.
- Strong background in one or more: large-scale data processing, data infrastructure engineering, or data modeling.
- Solid grasp of CDC patterns, schema-drift control, robust error handling, and recovery/replay.
- Proven track record improving pipelines via automation, testing, and performance tuning.
- Exposure to cloud data platforms (AWS/Azure/GCP), Databricks/Spark Structured Streaming is a plus.
- Proficient in Python and SQL (or any relevant programming languages) and be able to apply solid software engineering practices (testing, version control, code reviews).
- Strong SQL (complex queries, optimization) and Python (DB-API/pandas or PySpark) comfortable with Unix shell.
- Experience with one or more: Talend, IBM DataStage, Airflow, Kafka, Spark, Trino/Presto.
- Curious, resilient, and critical thinker, open to feedback and continuous improvement.
- Financial services, risk and regulatory data experience (e.g., IECC, IFRS9, Basel, BOT, AML, Credit Risk, Compliance) is an advantage.
- Why join Krungsri?
- As part of MUFG (Mitsubishi UFJ Financial Group), we are a truly global bank with networks all over the world.
- We offer a striking work-life balance culture with hybrid work policies (2 days minimum in office per week).
- Unbelievable benefits such as attractive bonuses, employee loans at special rates, and many more.
- Apply now before this role is closed.
- FB: Krungsri Career (http://bit.ly/FacebookKrungsriCareer).
- LINE: Krungsri Career (http://bit.ly/LineKrungsriCareer).
- Talent Acquisition Department
- Bank of Ayudhya Public Company Limited
- 1222 Rama III Rd., Bangpongpang, Yannawa, Bangkok 10120.
- Remark: The bank requires, and will carry out, a process for verifying applicants' personal information related to criminal history before they are considered for employment with the bank.
- Applicants can read the Personal Data Protection Announcement of the Bank's Human Resources Function via the links below.
- EN (https://krungsri.com/b/privacynoticeen).
- Thai (https://krungsri.com/b/privacynoticeth).
Skills:
Tableau, Power BI, Excel
Job type:
Full-time
Salary:
negotiable
- Leads and improves analytical programs and presents the results.
- Analyzes the variable granular data of the business and regularly uses advanced models to deliver business insights across diverse business domains.
- Answers and anticipates critical business questions and opportunities and delivers insights to the business in ways that make significant impact.
- Demonstrates use of data visualization (Tableau, Power BI, Excel), and analytic tools (Python, R, SQL, KNIME/Alteryx) to grasp the business insights from mountains of data.
- Collaborates with multifunctional teams (Operations, Initiatives, and IT etc.) to improve key business priorities based on data-driven insights.
- Lead the roll-out and development of digital solution per business needs.
Skills:
Big Data, SQL, Hadoop
Job type:
Full-time
Salary:
negotiable
- Develop and maintain robust data pipelines to ingest, process, and transform raw data into formats suitable for LLM training.
- Conduct meeting with users to understand the data requirements and perform database design based on data understanding and requirements with consideration for performance.
- Maintain data dictionary, relationship and its interpretation.
- Analyze problem and find resolution, as well as work closely with administrators to monitor performance and advise any necessary infrastructure changes.
- Work with business domain experts, data scientists and application developers to identify data that is relevant for analysis.
- Develop big data solutions for batch processing and near real-time streaming.
- Own end-to-end data ETL/ELT process framework from Data Source to Data warehouse.
- Select and integrate appropriate tools and frameworks required to provide requested capabilities.
- Design and develop BI solutions.
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems.
- Keep abreast of new developments in the big data ecosystem and learn new technologies.
- Ability to effectively work independently and handle multiple priorities.
- Bachelor degree or higher in Computer Science, Computer Engineering, Information Technology, Management Information System or an IT related field.
- 3+ years of experience in Data Management or Data Engineering (retail or e-commerce business preferable).
- Expert-level experience with query languages (SQL), Databricks SQL, PostgreSQL.
- Experience with big data technologies such as Hadoop, Apache Spark, Databricks.
- Experience in Python is a must.
- Experience in Generative AI is a must.
- Knowledge in machine/statistical learning, data mining is a plus.
- Strong analytical, problem solving, communication and interpersonal skills.
- A good attitude toward teamwork and a willingness to work hard.
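The first responsibility in this posting, transforming raw data into formats suitable for LLM training, typically involves normalization and deduplication steps. A minimal, hypothetical sketch of the exact-deduplication step (function names and the corpus are invented for illustration):

```python
import hashlib

def normalize(text: str) -> str:
    """Collapse whitespace and lowercase the text for comparison."""
    return " ".join(text.split()).lower()

def dedup_for_training(documents):
    """Drop exact duplicates (after normalization) from a corpus.

    A real pipeline would also do near-duplicate detection, quality
    filtering, and tokenization; this shows only exact dedup.
    """
    seen = set()
    kept = []
    for doc in documents:
        digest = hashlib.sha256(normalize(doc).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(doc)
    return kept

corpus = ["Hello  World", "hello world", "Fresh document"]
print(dedup_for_training(corpus))
```

Hashing normalized text keeps memory bounded by one digest per unique document, which matters at corpus scale.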
- CP AXTRA | Lotus's
- CP AXTRA Public Company Limited.
- Nawamin Office: Buengkum, Bangkok 10230, Thailand.
- By applying for this position, you consent to the collection, use and disclosure of your personal data to us, our recruitment firms and all relevant third parties for the purpose of processing your application for this job position (or any other suitable positions within Lotus's and its subsidiaries, if any). You understand and acknowledge that your personal data will be processed in accordance with the law and our policy. .
Skills:
Research, ETL, Automation
Job type:
Full-time
Salary:
negotiable
- Lead the design and development of data architecture, ensuring scalability, security, and alignment with business strategy.
- Oversee the collection, transformation, and integration of data from multiple internal and external sources.
- Conduct advanced research and troubleshooting to address complex business and technical problems.
- Design, build, and optimize data pipelines and ETL processes to handle large-scale and real-time data.
- Implement automation solutions to minimize manual intervention and improve data efficiency.
- Provide technical leadership and mentorship to junior engineers, ensuring best practices in coding, testing, and deployment.
- Collaborate with cross-functional stakeholders including Data Scientists, Analysts, and Business Leaders to deliver actionable insights.
- Evaluate and recommend new tools, frameworks, and technologies to enhance data engineering capabilities.
- Job Specification: Bachelor's Degree in Information Technology, Computer Science, Statistics, Mathematics, Business, or a related field (Master's Degree is a plus).
- Minimum of 5 years of experience in data engineering, with at least 2 years in a senior or lead role.
- Proven expertise in the data analytics lifecycle, including business problem framing, KPI/metrics design, exploratory analysis, and presenting data insights.
- Strong hands-on experience with cloud platforms (AWS, GCP, Azure) and advanced programming skills in Python, Java, PySpark.
- Solid knowledge of data processing, ETL frameworks, data warehousing, and messaging queue systems (e.g., Kafka).
- Demonstrated experience in designing highly scalable, resilient, and secure data systems.
Experience:
3 years required
Skills:
ETL, Apache, Python, English
Job type:
Full-time
Salary:
negotiable
- Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1000 clients across the globe, we have been rolling out solutions in major projects for over a decade - this is made possible by an international team of 7,600 people spread across 5 continents and more than 60 countries. Our solutions focus on four different Business Lines: Information System & Digital, Telecom, Life Sciences and Engineering. We're focused on building and nurturing a top talent community where all our team members can achieve their full pot ...
- Brief Call: Our process typically begins with a brief virtual/phone conversation to get to know you! The objective? Learn about you, understand your motivations, and make sure we have the right job for you!
- Interviews (the average number of interviews is 3 - the number may vary depending on the level of seniority required for the position). During the interviews, you will meet people from our team: your line manager of course, but also other people related to your future role. We will talk in depth about you, your experience, and skills, but also about the position and what will be expected of you. Of course, you will also get to know Amaris: our culture, our roots, our teams, and your career opportunities!
- Case study: Depending on the position, we may ask you to take a test. This could be a role play, a technical assessment, a problem-solving scenario, etc.
- As you know, every person is different and so is every role in a company. That is why we have to adapt accordingly, and the process may differ slightly at times. However, please know that we always put ourselves in the candidate's shoes to ensure they have the best possible experience.
- We look forward to meeting you!
- Design and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark.
- Build and maintain data models and data lakes to support analytics and reporting.
- Develop reusable Python code for transformation, orchestration, and automation.
- Implement and tune complex PySpark and SQL queries for large-scale data processing.
- Collaborate with Data Scientists, Analysts, and Business Units to deliver scalable solutions.
- Ensure data quality, governance, and metadata management across projects.
- Manage Azure cloud services for data infrastructure and deployment.
- Support daily operations and performance of the Databricks platform.
- ABOUT YOU
- 3+ years of experience in Data Engineering.
- Experience with Databricks, Unity Catalog, Apache Spark, and distributed data processing.
- Strong proficiency in Python, PySpark, SQL.
- Knowledge of data warehousing concepts, data modeling, and performance optimization.
- Experience with Azure cloud data platforms (e.g., Azure Synapse).
- Familiarity with CI/CD and version control (Git, BitBucket).
- Understanding of real-time data streaming and tools such as Qlik for replication.
- Academic background: Bachelor's or Master's in Computer Science, Engineering, or related field.
- Fluent English. Another language is a plus.
- You have excellent problem-solving skills and can work independently as well as in a team.
- WHY AMARIS?
- Global Diversity: Be part of an international team of 110+ nationalities, celebrating diverse perspectives and collaboration.
- Trust and Growth: With 70% of our leaders starting at entry-level, we're committed to nurturing talent and empowering you to reach new heights.
- Continuous Learning: Unlock your full potential with our internal Academy and over 250 training modules designed for your professional growth.
- Vibrant Culture: Enjoy a workplace where energy, fun, and camaraderie come together through regular afterworks, team-building events, and more.
- Meaningful Impact: Join us in making a difference through our CSR initiatives, including the WeCare Together program, and be part of something bigger.
- Equal opportunity
- Amaris Consulting is proud to be an equal opportunity workplace. We are committed to promoting diversity within the workforce and creating an inclusive working environment. For this purpose, we welcome applications from all qualified candidates regardless of gender, sexual orientation, race, ethnicity, beliefs, age, marital status, disability or other characteristics.
Job type:
Full-time
Salary:
negotiable
- We are looking for a skilled Data Engineer to join our team and help build and maintain our data infrastructure. The ideal candidate will be responsible for designing, implementing, and managing our data processing systems and pipelines. You will work closely with data scientists, analysts, and other teams to ensure efficient and reliable data flow throughout the organization.
- Design, develop, and maintain scalable data pipelines for batch and real-time processing.
- Implement ETL processes to extract data from various sources and load it into data warehouses or data lakes.
- Optimize data storage and retrieval processes for improved performance.
- Collaborate with data scientists and analysts to understand their data requirements and provide appropriate solutions.
- Ensure data quality, consistency, and reliability across all data systems.
- Develop and maintain data models and schemas.
- Implement data security measures and access controls.
- Troubleshoot data-related issues and optimize system performance.
- Stay up-to-date with emerging technologies and industry trends in data engineering.
- Document data architectures, pipelines, and processes.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-4 years of experience in data engineering or similar roles.
- Strong programming skills in Python, Java, or Scala.
- Proficiency in SQL and experience with relational databases (e.g., Databricks, PostgreSQL, MySQL).
- Familiarity with cloud platforms (AWS, Azure, or Airflow) and their data services.
- Knowledge of data warehousing concepts and ETL best practices.
- Experience with version control systems (e.g., Git).
- Understanding of data cleansing, data modeling, and database design principles.
- Solid problem-solving skills and attention to detail.
- Good communication skills and the ability to work with technical and non-technical team members.
- Experience with the Azure data platform (ADF, Databricks).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of stream processing technologies (e.g., Kafka, API, Google BigQuery, MongoDB, SFTP sources).
- Experience with containerization technologies (e.g., Docker).
- Experience dealing with large data volumes and optimization skills in development.
- Understanding of machine learning concepts and data science workflows.
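The ETL responsibilities this posting describes (extract from sources, transform, load into a warehouse) can be sketched with `sqlite3` from the standard library. The table and column names here are invented for illustration; a real pipeline would target a warehouse like Databricks or PostgreSQL:

```python
import sqlite3

# Source system and warehouse, both in-memory for the sketch.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
src.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 1050), (2, 250)])

wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE fact_orders (id INTEGER PRIMARY KEY, amount REAL)")

# Extract.
rows = src.execute("SELECT id, amount_cents FROM raw_orders").fetchall()
# Transform: convert cents to a decimal amount.
transformed = [(order_id, cents / 100.0) for order_id, cents in rows]
# Load as an idempotent upsert, so reruns do not duplicate rows.
wh.executemany(
    "INSERT INTO fact_orders (id, amount) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
    transformed,
)
wh.commit()
print(wh.execute("SELECT id, amount FROM fact_orders ORDER BY id").fetchall())
```

The upsert makes the load step safe to rerun, one common way to keep batch pipelines idempotent.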
Skills:
Compliance, AutoCAD
Job type:
Full-time
Salary:
negotiable
- Assist in the design and development of mechanical systems including HVAC, Fire protection, and Hydraulic systems tailored for data center needs.
- Help plan and manage maintenance activities for mechanical systems, ensuring adherence to industry standards and operational efficiency.
- Maintain accurate records of mechanical system designs, maintenance activities, and compliance with safety regulations.
- Participate in site inspections to assess mechanical systems' condition and ensure compliance with design specifications.
- Assist in coordinating with third-party vendors for maintenance and upgrades, ensuring that all work meets established standards.
- Be available to respond to on-site incidents and assist senior engineers in troubleshooting mechanical failures.
- Engage in ongoing training and professional development opportunities to stay updated on the latest technologies in the data center industry.
- Job Qualifications: Bachelor's degree in mechanical engineering or a related field is required.
- More than 5 years of experience in mechanical engineering, preferably within a data center or critical environment.
- Basic understanding of HVAC systems, mechanical design principles, and relevant software tools (e.g., AutoCAD).
- Strong problem-solving abilities to identify issues and propose effective solutions.
- Good verbal and written communication skills for effective collaboration with team members and vendors.
- Ability to work well within a team environment while also being capable of taking initiative when necessary.
- Fluent in English both written and verbal (Minimum 750 TOEIC score).
- Goal-Oriented, Unity, Learning, Flexible.
Job type:
Full-time
Salary:
negotiable
- The Senior Data Engineer position plays a key role in designing, developing, and managing cloud-based data platforms, as well as creating data structures for high-level data analysis, and works with business and technical teams to ensure that data management is appropriate and supports organizational goals.
- Responsible for the design, construction, and maintenance of optimal and scalable data pipeline architectures on cloud platforms (e.g., GCP, AWS, Azure).
- Oversee the development and management of complex ETL/ELT processes for data ingesti ...
- Author and optimize advanced, high-performance SQL queries for complex data transformation, aggregation, and analysis.
- Leverage the Python programming language for automation, scripting, and the development of data processing frameworks.
- Administer and optimize cloud-based data warehouse solutions and associated data lakes.
- Collaborate professionally with data scientists, analysts, and key business stakeholders to ascertain data requirements and deliver effective technical solutions.
- Provide mentorship to junior engineers and champion the adoption of data engineering best practices throughout the organization.
- Bachelor's degree or higher in Computer Science, Information Technology, Engineering, or a related field.
- At least 5 years of experience working in a data engineering or related position.
- Proficient in advanced SQL, including query optimization and performance tuning.
- Experienced in managing and designing architecture on at least one major cloud platform (Google Cloud Platform, AWS, or Azure).
- Skilled in using Python for data processing and advanced pipeline development.
- Experienced with tools and technologies for data ingestion, connectivity, and management.
- Deep understanding of data modeling principles, data warehousing methodologies, and modern data architecture.
- Excellent analytical and problem-solving skills.
- Communication and teamwork skills.
- Ability to plan and manage tasks effectively.
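The "advanced, high-performance SQL" responsibility above often means analytical queries with window functions. A small, hypothetical example using `sqlite3` (the `sales` table and columns are invented; SQLite 3.25+ supports window functions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2024-01-01", 100.0),
     ("north", "2024-01-02", 150.0),
     ("south", "2024-01-01", 80.0)],
)

# Rank each day's sales within its region, highest amount first.
rows = conn.execute(
    """
    SELECT region, day, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()
print(rows)
```

The same pattern (partitioned ranking) carries over to warehouse engines such as BigQuery or Databricks SQL with minor dialect changes.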
Experience:
4 years required
Skills:
Electrical Engineering, Mechanical Engineering, Excel, English
Job type:
Full-time
Salary:
negotiable
- Supervise contractors who perform servicing or preventive maintenance.
- Perform limited maintenance tasks to include: filter changes, battery system PMs, and Rack PDU & Rack ATS replacements.
- Perform root cause analysis for operational issues.
- Troubleshoot facility and rack level events.
- Ensure all personnel on-site follow safety protocols.
- Work on-call and a rotating schedule as needed.
- Take daily operational readings and provide metrics reporting to senior engineers.
- Perform basic support concepts such as ticketing systems, root cause analysis, and task prioritization.
- Diverse Experiences
- AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
- Why AWS?
- Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
- Inclusive Team Culture
- AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do.
- Mentorship & Career Growth
- We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
- Work/Life Balance
- We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
- BASIC QUALIFICATIONS.
- Associate's Degree or Technical Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- Fluent in English language, both written and spoken.
- 2+ years working in a Data Center or Mission Critical Environment.
- PREFERRED QUALIFICATIONS.
- Bachelor's Degree in Electrical Engineering, Mechanical Engineering, or a relevant discipline.
- 4+ years of Data Center Operation Experience.
- Fundamental knowledge of network design and layout as well as low voltage (copper/ fiber) cabling.
- 2+ years with Microsoft Excel, Word, and Outlook.
- Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Skills:
Sales, Negotiation, Software Development, English
Job type:
Full-time
Salary:
negotiable
- Provide technical consultancy and design solutions for Enterprise Customers, primarily focusing on Enterprise Data Service products.
- Support the Sales team with technical opportunities, prepare technical proposals, and respond to customer requirements.
- Take ownership of customer solutions and architecture design, including solution costing.
- Collaborate with vendors, providers, and partners to optimize project investment costs for enhanced competitiveness.
- Coordinate and hand over customer solutions to the delivery and operation teams.
- Coordinate with vendors and partners to explore new potential technologies.
- Share technical and service knowledge with internal stakeholders.
- Bachelor of Engineering (Computer, IT, Telecommunication) or Computer Science.
- 5-8 years of experience in IT, Telecommunication, Data Communication Service, Pre-Sales, Post-Sales, or IP Network Operations & Planning.
- At least 5 years of professional experience as a Pre-Sales Engineer in a technical environment.
- Excellent presentation and negotiation skills.
- Strong analytical skills, excellent interpersonal and communication skills.
- Fluency in English is preferable.
- Strong knowledge and experience in Optical, DWDM, MPLS, and Routing are preferable.
- Extensive experience in the ISP, Internet peering, and International connectivity industry is preferable.
- Relevant software development experience or commercial/sales experience.
- Ability to understand and determine an enterprise customer's needs and how AIS's products and solutions might best fit through a consultative approach.
- Ability to thrive in a fast-paced environment, set demanding expectations, and consistently exceed them.
- Must be organized, a self-starter, and capable of delivering high-quality work without constant tactical oversight.
Skills:
Assurance
Job type:
Full-time
Salary:
negotiable
- Develop and analyze Enterprise Service revenue to understand Product and Service trends within the AIS Group, ensuring that revenue collection, promotion packages, and new services are properly executed according to the company's business conditions.
- Identify suitable QA methods to reduce revenue loss and prevent errors in the Line Operation's work, sharing knowledge to strengthen revenue assurance.
- Verify the completeness and accuracy of service calculations, promotion packages, and offerings for corporate customers.
- Review the calculation of Postpaid Voice and IDD services, as well as IR services in the RBM system, ensuring they are correct, complete, and in line with the rates and conditions set by the company, with no abnormalities that could lead to revenue loss (Real Loss/Opportunity Loss).
- Communicate, coordinate, and follow up on issues causing revenue loss, identify the root causes, and work with relevant departments to resolve the problems, reducing revenue loss, particularly in Voice services.
- Analyze data from various sources within the scope of responsibility using data analytics skills, reflecting trends, performance, efficiency, and effectiveness of Products and Services. Identify abnormalities impacting revenue loss (Real Loss/Opportunity Loss & Fraud), and conduct audits and monitoring.
- Prepare analysis reports to support management strategies and assess risks in various areas.
- Perform other duties as assigned by the supervisor.
Experience:
2 years required
Skills:
Business Statistics / Analysis, English
Job type:
Full-time
Salary:
฿27,100 - ฿35,000, negotiable
- Communicate and coordinate with all sales team channels to re-confirm output and training.
- Activate/implement special projects; monitor operations and define the key success factors.
- Plan and monitor all sales operations in the TT channel so the sales system runs smoothly.
- Work closely with the team to understand their data needs and provide actionable insights.
- Initiate and design the big picture using canvas tools to support the team.
- Job Qualification.
- Bachelor's Degree in Business Administration, Computer Science or related fields..
- Have at least 1-3 years' experience of Support sales team and design BI dashboard..
- Strong in BI Dashboard and Data Analysis and can design visualize presentation..
- Able to clearly communication with internal and external, easy-to-understand and actionable way..
- Able manage multiple projects simultaneously with attention to detail..
- Works well with cross-functional teams, stays flexible and solution-oriented in dynamic environments..
- Stays positive and solutions-focused under pressure..
- Fluency in spoken & written English..
Skills:
SQL, Tableau, Power BI
Job type:
Full-time
Salary:
negotiable
- Data Cleaning and Preparation - Retrieve data from one or more sources and prepare it so it is ready for numerical and categorical analysis. Data cleaning also involves handling missing and inconsistent data that may affect the analysis.
- Data Analysis and Exploration - Take a business question or need and turn it into a data question. Then transform and analyze data to extract an answer, and surface interesting trends or relationships that could bring value to the business.
- Creating Data Visualizations and Communication - Produce reports or build dashboards on your findings and communicate them to business stakeholders and management.
- Statistical Knowledge.
- Mathematical Ability.
- Programming languages, such as SQL.
- Analytic tools such as Tableau, Power BI.
- TeraData, Big data Hadoop Tech, Cloud Tech.
- Bachelor's degree in MIS, Business, Economics, Computer Science, or a related field.
- At least 2-3 years of experience in Data Analysis.
- Experience in designing and architecting BI / Data Analytics solutions is preferred.
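The cleaning-analysis-visualization cycle described above can be sketched in a few lines of pandas. This is a minimal, hypothetical example (the `region`/`revenue` columns and their values are invented for illustration): standardize inconsistent labels, fill missing values, then answer a simple business question with a group-by.

```python
import pandas as pd

# Hypothetical raw extract with the issues the role describes:
# inconsistent category labels and missing values.
raw = pd.DataFrame({
    "region": ["North", "north ", "South", None],
    "revenue": [1200.0, None, 950.0, 400.0],
})

# Data cleaning: standardize labels, then handle missing data.
clean = raw.assign(region=raw["region"].str.strip().str.title())
clean["region"] = clean["region"].fillna("Unknown")
clean["revenue"] = clean["revenue"].fillna(clean["revenue"].median())

# Data analysis: turn "which region earns most?" into a data question.
summary = clean.groupby("region", as_index=False)["revenue"].sum()
print(summary)
```

In practice the `summary` table would feed a report or dashboard rather than a `print` call.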
Skills:
Power BI, Tableau, Finance, English
Job type:
Full-time
Salary:
negotiable
- Collect, clean, and analyze data from sales, marketing, and CRM systems.
- Build and maintain dashboards and performance reports (Power BI, Tableau, or Google Data Studio).
- Monitor and evaluate campaign results, product performance, and sales trends.
- Provide data-driven insights to support marketing strategies and business planning.
- Collaborate with sales, finance, and product teams to align business insights.
- Ensure data accuracy and consistency across all reports.
- Bachelor's degree in Business Analytics, Data Science, Statistics, Economics, or a related field.
- At least 3 years of experience in data analysis, business intelligence, or commercial analytics.
- Strong skills in Excel, SQL, and data visualization tools (Power BI, Tableau, or Google Data Studio).
- Experience with CRM and marketing analytics tools is a plus.
- Analytical, detail-oriented, and comfortable presenting insights to management.
- Good command of English and Thai, both written and spoken.
Experience:
1 year required
Skills:
SQL, Python, Automation, English
Job type:
Full-time
Salary:
negotiable
- Develop data pipelines and gather data into internal tables using SQL, in-house tools, and Python.
- Collaborate with stakeholders to prepare, process, and validate data for business needs.
- Make recommendations on improvement, maintenance, or other factors to improve the database system.
- Develop reports, dashboards, and automation solutions using Spark, SQL, Python, Excel, and in-house tools.
- Ensure data integrity by sanitizing, validating, and aligning numbers with accurate logic.
- Requirements: Master's or Bachelor's degree in a quantitative or other relevant field.
- 1-3 years of experience in Data Analytics, Data Engineering, or Business Intelligence.
- Experience in project & stakeholder management responsibilities.
- Strong SQL skills for data querying and Excel proficiency (Python is a plus).
- Strong English communication skills, both verbal and written.
- Detail-oriented and enjoy building from the ground up.
- Fresh Graduates are welcome.
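The pipeline responsibilities above (query with SQL, validate and sanitize in Python, load into internal tables) follow a common pattern. Below is a hedged sketch using SQLite as a stand-in database; the table and column names (`raw_orders`, `clean_orders`, `amount`) are hypothetical, not taken from the job description.

```python
import sqlite3

# In-memory database standing in for the source and internal stores.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.0), (2, None), (3, 25.5)])

# Extract with SQL, then validate in Python: drop rows whose
# numbers fail the logic check (missing or negative amounts).
rows = conn.execute("SELECT id, amount FROM raw_orders").fetchall()
valid = [(i, a) for i, a in rows if a is not None and a >= 0]

# Load the sanitized rows into an internal table for reporting.
conn.execute("CREATE TABLE clean_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO clean_orders VALUES (?, ?)", valid)
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM clean_orders").fetchone()[0]
print(total)
```

A production version would swap SQLite for the actual warehouse connection and schedule the script from an orchestrator, but the extract-validate-load shape stays the same.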
Experience:
No experience required
Skills:
Python, SQL, Database Administration, English, Thai
Job type:
Full-time
Salary:
฿35,000 - ฿45,000, negotiable
- Guide and train new customers to confidently use our system.
- Monitor customer activity, troubleshoot basic issues, and coordinate with internal teams.
- Analyze and manage customer data to ensure readiness for real-time use.
- Work closely with logistics, operations, and tech teams to deliver a seamless onboarding experience.
- Travel and visit customer sites.
- Experience in Customer Support or Data Analysis is a plus; new graduates are welcome to apply.
- Proficiency in Excel and SQL; Python skills are a plus.
- Excellent communication skills in both Thai and English.
- Adaptable, quick to learn, and able to work under pressure.
- Educational background in IT, Computer Science, or related fields is preferred.
- Allows you to apply your skills in data, technology, and customer service.
- Supports your personal and professional development.
Experience:
3 years required
Skills:
ERP, SQL, Power BI, Data Analysis, Business Statistics / Analysis, Thai, English
Job type:
Full-time
Salary:
฿40,000 - ฿60,000, negotiable
- Work with customers and business users to gather requirements and summarize key points for project development.
- Integrate data from various business systems/sources (e.g., ERP, CRM, and others) to ensure comprehensive and accurate analysis.
- Design and develop ETL (Extract, Transform, Load) processes to extract data from various sources.
- Prepare the data so it is ready for analysis, including handling missing and inconsistent data that may affect the results.
- Design, develop, and implement data models to serve customer requirements.
- Design, build, and maintain dashboards and reports that visualize key metrics and performance indicators.
- Bachelor's degree in Data Analytics, Computer Engineering, Computer Science, MIS, Statistics, or related fields.
- At least 2-3 years' experience in Data Analysis or Business Analysis.
- Proficiency in data tools such as SQL, Excel, BI (Power BI and Tableau), or Python.
- Experience in ERP, CRM, Cloud Technology, Software Development is preferred.
- Very good problem-solving, negotiation, presentation, and communication skills.
- Good written and verbal English communication.
- A collaborative team player with effective communication abilities.
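The integration and ETL duties above boil down to extracting from several systems, transforming, and loading a model-ready table. A hedged pandas sketch of that flow follows; the ERP/CRM frames, column names, and values are invented for illustration, and a real pipeline would write the result to a warehouse rather than print it.

```python
import pandas as pd

# Extract: hypothetical exports from two source systems.
erp = pd.DataFrame({"customer_id": [1, 2, 3],
                    "order_total": [500.0, 300.0, None]})
crm = pd.DataFrame({"customer_id": [1, 2, 4],
                    "segment": ["SME", "Enterprise", "SME"]})

# Transform: handle the missing order total, then integrate the
# two sources on the shared customer key.
erp["order_total"] = erp["order_total"].fillna(0.0)
model = erp.merge(crm, on="customer_id", how="left")
model["segment"] = model["segment"].fillna("Unassigned")

# Load: a real ETL job would persist this table; here we inspect it.
print(model)
```

The left join keeps every ERP customer even when the CRM has no matching record, which is why the unmatched segment needs an explicit fallback value.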
