Job Description:
We are seeking a highly skilled Data Scientist to drive the adoption of algorithmic decision-making at scale within Group Digital. This role focuses on developing and deploying machine learning models and neural networks, supporting MLOps practices, and enhancing personalization and automation within our digital products. You will work closely with data engineers, product teams, and business stakeholders to build scalable data pipelines, CI/CD workflows, and ETL processes. The ideal candidate will have experience in retail, personalization, web technologies, and cutting-edge AI methods, including recommendation engines, Large Language Models (LLMs), Generative AI, and Knowledge Graphs.
Key Responsibilities:
Machine Learning & AI Development
• Develop and optimize predictive and prescriptive models, using supervised and unsupervised learning, to extract insights and enhance decision-making. Apply deep learning and neural network techniques for customer classification, profiling, segmentation, and personalization.
• Utilize MLOps and GCP services to efficiently deploy, monitor, and maintain ML models in production.
• Implement and fine-tune Large Language Models (LLMs) and Generative AI solutions for automation and user engagement.
• Explore and integrate knowledge graphs to enhance data relationships and improve AI-driven recommendations.
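As a hedged illustration of the recommendation-engine work described above (this is not the team's actual system; the ratings data, item names, and scoring scheme are invented for the example), a minimal user-based collaborative-filtering sketch in plain Python:

```python
import math

# Toy user-item rating matrix; users, items, and ratings are invented.
ratings = {
    "u1": {"shoes": 5.0, "socks": 3.0},
    "u2": {"shoes": 4.0, "hat": 2.0},
    "u3": {"socks": 4.0, "hat": 5.0},
}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse rating vectors."""
    shared = set(a) & set(b)
    num = sum(a[i] * b[i] for i in shared)
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(user: str, k: int = 1) -> list:
    """Score items the user hasn't rated by similarity-weighted ratings of other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, r in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Production systems would use matrix factorization or learned embeddings, but the neighborhood idea sketched here is the usual starting point.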
Data Engineering & Pipelines
• Work with data engineers to design and develop robust data pipelines for large-scale ETL processing using SQL and cloud-based solutions (GCP preferred).
• Write complex SQL queries for extracting, transforming, and loading (ETL) data efficiently. Implement CI/CD workflows to automate model training, deployment, and monitoring.
Collaboration & Agile Development
• Work in an Agile/DevOps environment, collaborating with cross-functional teams to drive data-driven innovation.
• Promote a data-centric culture by educating teams on the strategic importance of AI and analytics.
• Clearly communicate complex methodologies, results, and business insights to both technical and non-technical audiences.
Required Skills & Qualifications:
• Strong experience in Data Science, Machine Learning, or related fields.
• Strong expertise in Python, SQL, and modern ML frameworks (TensorFlow, PyTorch, Scikit-Learn).
• Experience with MLOps tools (MLflow, Kubeflow, Airflow) for model deployment and monitoring.
• Proficiency in cloud platforms (GCP) and scalable data engineering.
• Experience implementing and testing recommendation engines.
• Strong understanding of probability theory, statistics, and experimental design (A/B Testing).
• Experience with collaborative software engineering practices (Agile, DevOps).
• Bachelor's or Master’s degree in Computer Science, Mathematics, Engineering, or related field.
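To make the A/B-testing requirement concrete, a minimal two-proportion z-test sketch in plain Python (the conversion counts and the function name are invented for illustration; in practice a statistics library would be used):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 2.6% vs control's 2.0%.
z, p = two_proportion_ztest(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
```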
Preferred Qualifications:
• Background in Retail and Personalization Web Technologies.
• Experience with Knowledge Graphs and their integration into AI/ML pipelines.
• Hands-on experience in LLMs (e.g., GPT, BERT, LLaMA, Claude) and Generative AI technologies.
• Understanding of client’s digital ecosystem and data-driven decision-making.
• Proficiency in business intelligence (BI) tools and data visualization.
Why Join Us?
• Work on high-impact AI/ML projects driving digital transformation.
• Collaborate with top-tier professionals in a fast-paced environment.
If you are passionate about machine learning, MLOps, knowledge graphs, and cutting-edge AI, apply now to be part of our innovative team!
Job Summary
We are seeking a skilled Pega Developer with hands-on experience in Pega PRPC application development, workflow automation, enterprise integrations, and decisioning support. The ideal candidate will have experience designing case types, stages, flows, UI screens, reusable rule components, and declarative rules, along with strong knowledge of system integrations, performance tuning, and Agile delivery. This role requires close collaboration with cross-functional teams to build scalable, efficient, and business-focused applications.
Key Responsibilities
Design and develop Pega PRPC applications including case types, stages, flows, flow actions, and UI screens to automate business processes.
Build and maintain reusable rule components such as Data Pages, Data Transforms, Activities, and Decision Tables to improve standardization and maintainability.
Structure application class models and ruleset hierarchies following CSA/CSSA best practices.
Configure SLAs, routing rules, validation rules, and declarative rules such as Declare Expressions and Constraints.
Develop and support REST/SOAP integrations with external systems for customer, account, and transaction data exchange.
Perform guardrail analysis, performance tuning, and issue resolution using tools like Tracer and PAL.
Support deployment activities and CI/CD processes using tools such as Azure DevOps, Jenkins, Git, and related release pipelines.
Participate in Agile ceremonies, provide story estimations, and deliver development tasks within committed timelines.
Build and execute unit tests and regression validation to improve application quality and reduce production defects.
Design and optimize data flows for customer, product, and interaction data to support business decisioning and operational efficiency.
Contribute to decisioning configurations including engagement policies, arbitration rules, action hierarchies, adaptive models, and predictive scoring where applicable.
Required Qualifications
Bachelor’s degree in Computer Science, Information Technology, Engineering, or related field.
2+ years of experience in Pega application development.
Strong hands-on experience with Pega PRPC 8.x, including case management, flows, UI development, and rule configuration.
Experience with REST/SOAP services, JSON/XML data mapping, and external system integrations.
Understanding of Pega data pages, activities, data transforms, declarative rules, and validation mechanisms.
Experience in performance tuning and debugging using Pega tools such as Tracer and PAL.
Familiarity with Agile/Scrum development methodology.
Good understanding of SQL/databases such as PostgreSQL, MySQL, or Oracle.
Exposure to CI/CD and DevOps tools such as Jenkins, Git, GitHub/GitLab, Azure DevOps, or AWS deployment tools.
Preferred Qualifications
Pega CSA / CSSA / CDH certifications.
Experience with Pega Decisioning / Customer Decision Hub concepts such as engagement policies, arbitration, adaptive models, and predictive scoring.
Exposure to AWS services such as EC2, S3, Lambda, API Gateway, CloudWatch, IAM, SNS, and SQS.
Familiarity with GenAI-assisted development such as rule generation, automated test generation, or logic summarization.
Technical Skills
Pega Platform: Pega PRPC 8.x, Case Management, UI, Flows, Flow Actions, Activities, Data Transforms
Integrations: REST/SOAP, Connectors & Services, JSON/XML Mapping, Data Pages
Databases: PostgreSQL, MySQL, Oracle DB, SQL
DevOps / CI-CD: Azure DevOps, Jenkins, Git, GitHub, GitLab, Maven, AWS CodePipeline, CodeDeploy
Cloud: AWS, familiarity with Azure
Other: Agile, performance tuning, regression testing, decisioning support
About the job
A long‑standing client of ours is looking for an experienced ServiceNow Administrator to support the ongoing development and stability of their platform. This is a hands‑on role within a busy digital operations team, ideal for someone who enjoys improving processes, tightening configuration, and keeping the platform running smoothly day to day.
What you’ll be working on
Administering and maintaining core ServiceNow ITSM modules, including Incident, Problem, Change, Request, CMDB, Knowledge, Asset, and Service Catalog.
Managing platform configuration: forms, UI policies, business rules, client scripts, workflows, and notifications.
Supporting platform upgrades and release cycles, ensuring everything is tested and documented properly.
Working closely with process owners to refine workflows and improve user experience across the organisation.
Monitoring data quality, producing documentation, and helping drive adoption of new features.
What we’re looking for
Proven experience as a ServiceNow Administrator in a mid‑to‑large environment.
Strong understanding of ITSM processes and how they translate into the platform.
Confident troubleshooting issues and implementing clean, scalable solutions.
Comfortable engaging with both technical and non‑technical stakeholders.
Any exposure to Discovery, Service Portal, HRSD, or IntegrationHub is a bonus but not essential.
Job Description:
We are seeking an experienced Java Developer with strong expertise in building scalable enterprise applications. The ideal candidate will have hands-on experience with Java/J2EE technologies, Spring Framework, and Microservices architecture, along with cloud and container platform exposure.
Required Skills:
- Strong experience with Java, J2EE/JEE
- Hands-on experience with Spring and Spring Boot
- Experience developing RESTful Web Services and Microservices
- Solid knowledge of SQL and relational databases
- Experience with AWS cloud services
- Exposure to OpenShift or containerized environments
- Strong problem-solving and debugging skills
Preferred:
- Experience working in Agile/Scrum environments
- Knowledge of CI/CD pipelines and DevOps practices
- Experience with business analysis, testing, and the financial services domain
- Skilled at eliciting and documenting business- and user-level requirements
- Ability to gather and document UAT scenarios and results
- Familiarity with both Agile and Waterfall methodologies
- Experience with Jira
Key Responsibilities
- Lead end-to-end project management for a statewide IT implementation.
- Develop and manage project plans, schedules, budgets, risks, scope, quality, and communications.
- Coordinate with DPH leadership, IT teams, vendors, and business stakeholders.
- Establish governance structure and ensure compliance with State CIO and DHHS documentation standards.
- Develop procurement artifacts (RFP/RFQ), technical documentation, and vendor contracts.
- Oversee vendor deliverables and milestone adherence.
- Manage legacy system upgrades, data conversion, and rollout strategies.
- Facilitate Agile (Scrum/Kanban) and Waterfall project methodologies.
- Maintain project reporting within the State Touchdown system.
- Lead change management, training, and implementation readiness activities.
Required Qualifications
- 8+ years of project management experience in large, complex IT implementations.
- 8+ years managing system deployments with multiple stakeholders/clients.
- Strong experience managing scope, cost, schedule, risk, quality, and communications.
- 7+ years of experience with Microsoft Project, MS Office, and SharePoint.
- 5+ years of experience in both Waterfall and Agile methodologies.
- 8+ years vendor management experience on large IT projects.
- Technical background in architecture/infrastructure, databases, and web-based technologies.
- Experience writing RFPs and technical documentation.
- Experience with Agile tools (ServiceNow, Jira, VersionOne, etc.).
- Strong leadership, communication, analytical, and stakeholder management skills.
Preferred Qualifications
- PMP Certification preferred; Scrum Master certification a plus.
- Experience with State or Local Government projects.
- Public Health or Healthcare industry experience.
- Experience leading COTS/SaaS implementations.
- Experience with legacy system modernization and data migration.
- Strong SharePoint knowledge (sites, libraries, lists, groups).
- Experience managing Scrum teams and backlog prioritization.
Principal Demand Planning & Supply Chain Data Analytics Consultant Role
Seeking a hands-on developer with advanced Python, SQL, and full-stack data analytics skills, plus experience driving process impact in manufacturing and supply chain environments.
Deep Technical Execution Skills
- Advanced Python + Snowflake SQL
- Snowflake architecture & migration across environments
- Interactive visualizations (Power BI, SIGMA, QlikView, or Streamlit)
- GitHub automation & CI/CD
- Cloud-native deployments
- Containerization
- API/service-account design
- TDD & rollback strategies
- Large-scale relational data wrangling
Business Process Fluency (Manufacturing / Supply Chain)
- ERP and MRP systems and data structures
- Understanding of operational workflows in manufacturing and supply chain (Order-to-cash, Procure-to-pay, Demand-to-deliver)
- Enterprise-scale process understanding
Communications and Impact
- Translating business requirements into technical delivery and technical issues into business impacts
- ROI-driven analytics
- Multi-geography experience
- Demonstrated business impact
JOB REQUIREMENTS
- Five or more years of professional data science experience in manufacturing and/or supply chain environments
- Hands-on experience working with ERP data and operational processes including order-to-cash, procure-to-pay, and demand-to-deliver
- Proven ability to communicate complex or technical concepts clearly to both technical and non-technical audiences
- Advanced proficiency in Python and SQL; experience wrangling large relational datasets
- Strong proficiency with Snowflake as a data platform
- Strong proficiency in dashboarding and interactive visualizations, including Power BI, SIGMA, QlikView, or Streamlit
- Experience deploying enterprise-scale solutions in cloud environments
- Proficient in at least one data visualization library such as Matplotlib, Seaborn, or Plotly
- Familiarity with agile development best practices, including model training and validation workflows and CI/CD pipeline implementation
- Able to implement and manage GitHub repositories and automations, including automated GitHub workflows that promote Python projects and Snowflake development (queries, procedures, data structures, and data) from dev to test to prod environments
- Demonstrated ability to drive business impact and ROI through analytics
- Capable of executive-level communication and influence
- Experience leading cross-functional teams across multiple geographies
- Demonstrated ability to translate business requirements into technical projects that accomplish business objectives and achieve business value
- Experience developing and deploying scalable enterprise-grade containerized solutions, interactive front-ends, and back-ends using service accounts, APIs, safe handling of developer keys, and development best practices.
- Experience with test-driven development (TDD), deploying projects safely into production with roll-back options, and other common development best practices
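The GitHub automation and dev-to-test-to-prod promotion described above can be sketched as a GitHub Actions workflow. This is a minimal illustration, not a prescribed setup: the repository layout, script path, and secret names are assumptions.

```yaml
# Hypothetical promotion workflow; paths, scripts, and secrets are invented.
name: promote-to-test
on:
  push:
    branches: [dev]
jobs:
  deploy-snowflake:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      # Assumed in-repo script that applies queries/procedures to the target env.
      - run: python scripts/deploy_snowflake.py --target test
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

A matching workflow on the `test` branch would promote to prod, typically gated behind a manual approval environment.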
Job Description:
· Expert-level knowledge of .NET Core, C#, MVC, Entity Framework, WCF, Web API, REST, and SQL/Oracle databases.
· Working knowledge of JavaScript and jQuery.
· Knowledge of GraphQL in addition to REST; able to integrate internal services with external APIs using REST.
· Develop and maintain ASP.NET Core Web APIs.
· Experience leveraging AI tools or agentic AI to generate requirements, source code, or tests.
· Experience building ASP.NET Core web applications using MVC or Razor Pages.
· Experience building applications using ASP.NET Core Web API and GraphQL.
· Experience implementing SSO in ASP.NET Core web and Web API applications using OIDC or OAuth.
Role Overview
As a Data Engineer, you will be the technical backbone for our diverse portfolio of global clients. Unlike an in-house role, you will lead the migration, modernization, and optimization of data architectures across different industries. You are a problem-solver who can jump into a client’s messy legacy system and transform it into a high-performing, cloud-native data platform.
Key Service-Based Responsibilities
• Client Delivery: Lead the end-to-end implementation of data solutions, from initial discovery and requirements gathering to final deployment and handover.
• Cross-Platform Engineering: Build and maintain ETL/ELT pipelines across various cloud environments (AWS, Azure, and GCP) depending on the specific client’s ecosystem.
• Legacy Modernization: Help clients migrate from traditional on-premise databases (SQL Server, Oracle) to modern, scalable cloud data warehouses (Snowflake, BigQuery, Databricks).
• Data Strategy: Advise clients on best practices for data governance, security, and cost optimization within their data infrastructure.
• Technical Documentation: Create high-quality architectural diagrams and technical handbooks to ensure client teams can maintain the systems you build.
Technical Requirements
• Cloud Proficiency: Deep expertise in at least one major provider (AWS, Azure, or GCP), with certifications preferred.
• Data Warehousing: Advanced knowledge of Snowflake, Databricks, or Amazon Redshift.
• Code Mastery: Strong proficiency in Python and Advanced SQL (window functions, CTEs, performance tuning).
• Modern Data Stack: Experience with dbt (data build tool), Fivetran/Airbyte, and orchestration tools like Apache Airflow.
• DevOps/DataOps: Experience with Git, CI/CD pipelines, and Terraform/CloudFormation for Infrastructure as Code (IaC).
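As a hedged illustration of the advanced-SQL expectation above (window functions, CTEs), a self-contained sketch using Python's built-in sqlite3. The table and data are invented; warehouse dialects such as Snowflake and BigQuery use the same constructs:

```python
import sqlite3

# Hypothetical orders table; schema and values are invented for the example.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 100.0),
        ('acme', '2024-02-11', 250.0),
        ('beta', '2024-01-20', 80.0);
""")

# CTE plus a window function: each order with its customer's running total.
rows = con.execute("""
    WITH ranked AS (
        SELECT customer, order_date, amount,
               SUM(amount) OVER (
                   PARTITION BY customer ORDER BY order_date
               ) AS running_total
        FROM orders
    )
    SELECT * FROM ranked ORDER BY customer, order_date
""").fetchall()
```

Note that window functions require SQLite 3.25+, which ships with all currently supported Python versions.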
Preferred Qualifications
• Consulting Experience: Previous experience in a client-facing or agency environment.
• Industry Knowledge: Familiarity with data regulations (GDPR, HIPAA) relevant to different sectors like Finance or Healthcare.
Position Overview: The OSP Field Engineer performs field surveys and collects data to support fiber-optic and telecommunications engineering projects. This role documents existing aerial and underground utility conditions, validates routes, captures measurements, and ensures all information required for accurate design and permitting is collected.
Key Responsibilities
· Perform field walks to document poles, UG paths, utilities, ROW boundaries, and site conditions.
· Capture required photos: pole tags, heights, cable labels, MPOE, handholes, anchors, clearances, and obstructions.
· Measure spans, attachment heights, mid-span clearances, and accurately record GPS coordinates.
· Take detailed field notes following OSP Design Principles and maintain accurate route documentation.
· Convert field notes into Google Earth/KMZ format (preferred).
· Identify constructability issues, easements, conflicts, and permitting requirements.
· Upload and manage data using platforms such as Katapult Pro or IQGEO.
· Communicate findings with project managers, engineering teams, and other stakeholders.
Qualifications
· Experience in OSP fielding, telecom construction, or utility surveying (preferred).
· AT&T experience is preferred (IFP/ASE/GPON exposure a plus).
· Knowledge of Google Earth; ability to create or edit KMZ/KML files is a plus.
· Understanding of FTTx and AT&T GPON design is a plus.
· Must be legally authorized to work in the United States of America.
· Ability to work outdoors in all conditions and perform physically demanding field tasks.
· Valid driver’s license and reliable transportation.
Additional Requirements
· Flexible for short business trips and occasional weekend work (rare).
· Able to manage timelines efficiently while handling multiple projects simultaneously.
· Highly organized, capable of tracking deadlines, deliverables, and project expectations.
Skills
· Field surveying & measurement accuracy
· Photo documentation
· Map/GIS reading
· Problem-solving
· Time management & organization
· Knowledge of aerial/underground telecom infrastructure (preferred)
Role and Responsibilities:
- Lead design on financial process / integration platform / patterns and support the build team
- Ensure compliance of the data architecture with relevant financial regulations, accounting standards, and data privacy requirements
- Design and implement data security controls, such as data encryption, access controls, and audit trails, across the Oracle Fusion Financials ERP ecosystem
- Collaborate with Security and Compliance teams to ensure adherence to industry standards and regulatory requirements across SEA for the insurance industry
- Support the definition of solution design, test plans, and migration plans, and execute data migration activities from legacy systems to Oracle Fusion Financials ERP
- Collaborate with integration specialists and technical experts to ensure seamless data flow and integration between Oracle Fusion Financials modules and other enterprise systems
Skills Required:
- 8+ years of experience in designing and implementing data architectures for complex Oracle Fusion financial systems, preferably in the Oracle Subledgers (AP, AR, FA, Projects, Lease Accounting)
- Strong expertise in Oracle Fusion Financials ERP Subledger modules and their underlying data structures, including period close processes
- Familiarity with financial accounting concepts and processes
Responsibilities:
Should have a minimum of 8-10+ years of experience.
Good communication skills and a proactive attitude.
Understand client and stakeholder requirements.
Perform due diligence on tasks and deliver high-quality work on time.
Understand and align with the culture of the customer.
Oracle E-Business Suite (EBS) Developer - Technical
Experience in data collection and processing.
Strong experience in Oracle.
Experience in application maintenance and support.
12 years of experience required.
About The Role:
We are seeking a skilled Sr. Business Analyst with knowledge of the healthcare Payer & TPA industry and experience with premium billing, payments, and contribution accounting. This role involves requirements gathering, documenting, and analysis to align with the goals of the Program. We are seeking a passionate, results-oriented candidate who excels in client interactions and can collaborate effectively with internal and external stakeholders to analyse and document requirements for a client or product. This person must possess a blend of business and technical knowledge, experience in process mapping, and strong communication skills.
Primary Responsibilities:
Requirements Gathering & Analysis: Collaborate with stakeholders, including business units, technical teams, and clients, to gather and document detailed business requirements and process flows for health insurance systems, applications, and processes.
Solution Design & Implementation: Work closely with technical teams to translate business requirements into functional specifications, user stories, and system designs. Serve as the primary point of contact for development teams to ensure all requirements are fully understood and addressed in system designs and solutions.
Testing & Quality Assurance: Support the testing process by ensuring business requirements are correctly implemented, validating system functionality, and conducting user acceptance testing (UAT).
Experience working with and managing a small remote development team.
Essential Qualifications
Experience: 10-15 years of experience as a Business Analyst in health insurance systems and/or billing/payment systems.
Technical Skills: Proficiency in MS Office and other business analysis tools (e.g., SQL or a similar query language, Microsoft Office Suite, data import/export tools).
Knowledge: Deep understanding of the health insurance landscape, including Contribution Accounting, Billing and Payments, and System Integrations. Ability to document requirements and processes.
Analytical Skills: Strong problem-solving abilities with a keen eye for detail and a data-driven approach to analysis.
Communication: Excellent written and verbal communication skills with the ability to present complex concepts to both technical and non-technical stakeholders.
Collaboration: Proven ability to work cross-functionally with diverse teams, including IT, business operations, external partners, and remote teams.
Healthcare Certifications: Any relevant certifications in health insurance, healthcare analytics, or related areas would be an advantage.
Contribution Accounting Experience: Familiarity with Contribution Accounting, billing, and payment systems is a plus.
Strong attention to detail and organizational skills.
Ability to manage multiple tasks and prioritize effectively, with a commitment to delivering high-quality solutions.
Ability to adapt quickly to changing requirements and priorities.
Job Description:
GCP Experience
• Recent GCP experience
• Experience building data pipelines in GCP
• GCP Dataproc, GCS, and BigQuery experience
• Hands-on experience with developing data warehouse solutions and data products.
• Hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution (required)
• Hands-on experience in modeling and designing schemas for data lakes or RDBMS platforms.
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Practice working with, processing, and managing large data sets (multi TB/PB scale).
• Exposure to test driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's Degree in computer science or equivalent experience.
The most successful candidates will also have experience in the following:
• Gitflow
• Atlassian products – BitBucket, JIRA, Confluence etc.
• Continuous Integration tools such as Bamboo, Jenkins, or TFS
12+ years of experience required
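To illustrate the test-driven development exposure mentioned above, a minimal red-green cycle in Python's built-in unittest. The function and test cases are invented for the example; in TDD the tests are written first and the implementation is grown until they pass:

```python
import unittest

def normalize_severity(raw: str) -> str:
    """Map free-form severity strings to a canonical label (example logic)."""
    value = raw.strip().lower()
    if value in {"crit", "critical", "p1"}:
        return "critical"
    if value in {"warn", "warning", "p2"}:
        return "warning"
    return "info"

class NormalizeSeverityTest(unittest.TestCase):
    # These tests would exist before the implementation in a TDD cycle.
    def test_critical_aliases(self):
        self.assertEqual(normalize_severity(" P1 "), "critical")

    def test_warning_aliases(self):
        self.assertEqual(normalize_severity("warn"), "warning")

    def test_unknown_defaults_to_info(self):
        self.assertEqual(normalize_severity("mystery"), "info")
```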
Experience as an SAP Project Manager with a successful track record of managing complex S/4 HANA upgrade projects.
Develop a detailed project plan including timelines, milestones, resource allocation, and risk mitigation strategies.
Define project scope, considering business requirements and technical limitations.
Deep understanding of SAP S/4 HANA functionalities across various modules (FI, CO, MM, SD, etc.) and their impact on the upgrade process.
Lead the technical team in performing system analysis, data migration, and configuration changes necessary for the S/4 HANA upgrade.
Expertise in project management methodologies (Agile, Waterfall) and tools.
Strong analytical and problem-solving skills to identify and resolve technical challenges during the upgrade.
Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams and stakeholders.
Collaborate with stakeholders to gather functional requirements and prioritize features for upgrade.
Familiarity with data migration strategies and tools for SAP upgrades.
Knowledge of SAP best practices and industry standards related to S/4 HANA upgrades.
Experience in the Pharma and Life Sciences industry with CSV (Computer System Validation) documentation is needed.