Job Description:
We are seeking an experienced Java Developer with strong expertise in building scalable enterprise applications. The ideal candidate will have hands-on experience with Java/J2EE technologies, Spring Framework, and Microservices architecture, along with cloud and container platform exposure.
Required Skills:
- Strong experience with Java, J2EE/JEE
- Hands-on experience with Spring and Spring Boot
- Experience developing RESTful Web Services and Microservices
- Solid knowledge of SQL and relational databases
- Experience with AWS cloud services
- Exposure to OpenShift or containerized environments
- Strong problem-solving and debugging skills
Preferred:
- Experience working in Agile/Scrum environments
- Knowledge of CI/CD pipelines and DevOps practices
- Business analysis, testing, and Financial Services experience
- Skilled at listening to and documenting business- and user-level requirements
- Gather and document UAT
- Experience with both Agile and Waterfall methodologies
- Familiarity with Jira
Key Responsibilities
- Lead end-to-end project management for a statewide IT implementation.
- Develop and manage project plans, schedules, budgets, risks, scope, quality, and communications.
- Coordinate with DPH leadership, IT teams, vendors, and business stakeholders.
- Establish governance structure and ensure compliance with State CIO and DHHS documentation standards.
- Develop procurement artifacts (RFP/RFQ), technical documentation, and vendor contracts.
- Oversee vendor deliverables and milestone adherence.
- Manage legacy system upgrades, data conversion, and rollout strategies.
- Facilitate Agile (Scrum/Kanban) and Waterfall project methodologies.
- Maintain project reporting within the State Touchdown system.
- Lead change management, training, and implementation readiness activities.
Required Qualifications
- 8+ years of project management experience in large, complex IT implementations.
- 8+ years managing system deployments with multiple stakeholders/clients.
- Strong experience managing scope, cost, schedule, risk, quality, and communications.
- 7+ years of experience with Microsoft Project, MS Office, and SharePoint.
- 5+ years of experience in both Waterfall and Agile methodologies.
- 8+ years vendor management experience on large IT projects.
- Technical background in architecture/infrastructure, databases, and web-based technologies.
- Experience writing RFPs and technical documentation.
- Experience with Agile tools (ServiceNow, Jira, VersionOne, etc.).
- Strong leadership, communication, analytical, and stakeholder management skills.
Preferred Qualifications
- PMP certification preferred; Scrum Master certification a plus.
- Experience with State or Local Government projects.
- Public Health or Healthcare industry experience.
- Experience leading COTS/SaaS implementations.
- Experience with legacy system modernization and data migration.
- Strong SharePoint knowledge (sites, libraries, lists, groups).
- Experience managing Scrum teams and backlog prioritization.
Principal Demand Planning & Supply Chain Data Analytics Consultant Role
Seeking a hands-on developer with advanced Python, SQL, and full-stack data analytics skills, plus experience driving manufacturing and supply chain process impact.
Deep Technical Execution Skills
- Advanced Python + Snowflake SQL
- Snowflake architecture & migration across environments
- Interactive visualizations (Power BI, Sigma, QlikView, or Streamlit)
- GitHub automation & CI/CD
- Cloud-native deployments
- Containerization
- API/service-account design
- TDD & rollback strategies
- Large-scale relational data wrangling
Business Process Fluency (Manufacturing / Supply Chain)
- ERP and MRP systems and data structures
- Understanding of operational workflows in manufacturing and supply chain (Order-to-cash, Procure-to-pay, Demand-to-deliver)
- Enterprise-scale process understanding
Communications and Impact
- Translating business requirements into technical delivery and technical issues into business impacts
- ROI-driven analytics
- Multi-geography experience
- Demonstrated business impact
JOB REQUIREMENTS
- Five or more years of professional data science experience in manufacturing and/or supply chain environments
- Hands-on experience working with ERP data and operational processes including order-to-cash, procure-to-pay, and demand-to-deliver
- Proven ability to communicate complex or technical concepts clearly to both technical and non-technical audiences
- Advanced proficiency in Python and SQL; experience wrangling large relational datasets
- Strong proficiency with Snowflake as a data platform
- Strong proficiency in dashboarding and interactive visualizations, including Power BI, Sigma, QlikView, or Streamlit
- Experience deploying enterprise-scale solutions in cloud environments
- Proficient in at least one data visualization library such as Matplotlib, Seaborn, or Plotly
- Familiarity with agile development best practices, including model training and validation, and CI/CD pipeline implementation
- Able to implement and manage GitHub repositories and automations, including automated GitHub workflows that promote Python projects and Snowflake development (queries, procedures, data structures, and data) from dev to test to prod environments
- Demonstrated ability to drive business impact and ROI through analytics
- Capable of executive-level communication and influence
- Experience leading cross-functional teams across multiple geographies
- Demonstrated ability to translate business requirements into technical projects that accomplish business objectives and achieve business value
- Experience developing and deploying scalable enterprise-grade containerized solutions, interactive front-ends, and back-ends using service accounts, APIs, safe handling of developer keys, and development best practices.
- Experience with test-driven development (TDD), deploying projects safely into production with rollback options, and other common development best practices
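The TDD-plus-rollback expectation above can be illustrated with a minimal sketch. Everything here is hypothetical (the `Release` class, `promote`, and the environment names are invented for illustration, not part of any real deployment tooling): tests are written first to pin the contract, and a failed health check restores the previous version.

```python
import unittest
from contextlib import contextmanager


class Release:
    """Hypothetical tracker of the deployed version per environment."""

    def __init__(self):
        self.deployed = {"dev": None, "test": None, "prod": None}

    @contextmanager
    def promote(self, env, version, healthcheck):
        """Deploy `version` to `env`; roll back to the prior version on failure."""
        previous = self.deployed[env]
        self.deployed[env] = version
        try:
            if not healthcheck():
                raise RuntimeError("health check failed")
            yield
        except Exception:
            self.deployed[env] = previous  # rollback path
            raise


class PromoteTests(unittest.TestCase):
    """Written first, TDD-style, to pin the promote/rollback contract."""

    def test_successful_promotion_sticks(self):
        r = Release()
        with r.promote("test", "v2", healthcheck=lambda: True):
            pass
        self.assertEqual(r.deployed["test"], "v2")

    def test_failed_healthcheck_rolls_back(self):
        r = Release()
        r.deployed["prod"] = "v1"
        with self.assertRaises(RuntimeError):
            with r.promote("prod", "v2", healthcheck=lambda: False):
                pass
        self.assertEqual(r.deployed["prod"], "v1")


if __name__ == "__main__":
    unittest.main()
```

In a real pipeline the same shape would wrap Snowflake deployments or container rollouts; the point is only that the rollback branch is itself under test.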
Job Description:
· Expert-level knowledge of .NET Core, C#, MVC, Entity Framework, WCF, Web API and REST and SQL/Oracle databases.
· Working knowledge of JavaScript and jQuery.
· Knowledge of GraphQL in addition to REST.
· Integrate internal services with external APIs using REST.
· Develop and maintain ASP.NET Core Web APIs.
· Experience with leveraging AI tools or agentic AI to generate requirements, source code, or tests
· Experience in building ASP.Net Core Web applications using MVC or Razor pages.
· Experience in building applications using ASP.NET Core Web API and GraphQL.
· Experience in implementing SSO in ASP.Net Core Web and Web Api applications using OIDC or OAuth.
Role Overview
As a Data Engineer, you will be the technical backbone for our diverse portfolio of global clients. Unlike an in-house role, you will lead the migration, modernization, and optimization of data architectures across different industries. You are a problem-solver who can jump into a client’s messy legacy system and transform it into a high-performing, cloud-native data platform.
Key Service-Based Responsibilities
• Client Delivery: Lead the end-to-end implementation of data solutions, from initial discovery and requirements gathering to final deployment and handover.
• Cross-Platform Engineering: Build and maintain ETL/ELT pipelines across various cloud environments (AWS, Azure, and GCP) depending on the specific client’s ecosystem.
• Legacy Modernization: Help clients migrate from traditional on-premise databases (SQL Server, Oracle) to modern, scalable cloud data warehouses (Snowflake, BigQuery, Databricks).
• Data Strategy: Advise clients on best practices for data governance, security, and cost optimization within their data infrastructure.
• Technical Documentation: Create high-quality architectural diagrams and technical handbooks to ensure client teams can maintain the systems you build.
Technical Requirements
• Cloud Proficiency: Deep expertise in at least one major provider (AWS, Azure, or GCP), with certifications preferred.
• Data Warehousing: Advanced knowledge of Snowflake, Databricks, or Amazon Redshift.
• Code Mastery: Strong proficiency in Python and Advanced SQL (window functions, CTEs, performance tuning).
• Modern Data Stack: Experience with dbt (data build tool), Fivetran/Airbyte, and orchestration tools like Apache Airflow.
• DevOps/DataOps: Experience with Git, CI/CD pipelines, and Terraform/CloudFormation for Infrastructure as Code (IaC).
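The "Advanced SQL (window functions, CTEs)" bar above can be made concrete with a small self-contained sketch using SQLite's in-memory engine (the `orders` table and its columns are invented for the example; the same SQL pattern carries over to Snowflake or BigQuery):

```python
import sqlite3

# In-memory database with a toy orders table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('acme', '2024-01-05', 100.0),
  ('acme', '2024-01-20', 250.0),
  ('bolt', '2024-01-11',  80.0),
  ('bolt', '2024-02-02', 120.0);
""")

# A CTE feeding a window function: per-customer running total of spend.
rows = conn.execute("""
WITH customer_orders AS (
    SELECT customer, order_date, amount
    FROM orders
)
SELECT customer,
       order_date,
       SUM(amount) OVER (PARTITION BY customer ORDER BY order_date) AS running_total
FROM customer_orders
ORDER BY customer, order_date;
""").fetchall()

for row in rows:
    print(row)
# ('acme', '2024-01-05', 100.0)
# ('acme', '2024-01-20', 350.0)
# ('bolt', '2024-01-11', 80.0)
# ('bolt', '2024-02-02', 200.0)
```

The `PARTITION BY ... ORDER BY` frame restarts the running sum for each customer, which is the kind of per-entity analytics that plain `GROUP BY` cannot express in one pass.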
Preferred Qualifications
• Consulting Experience: Previous experience in a client-facing or agency environment.
• Industry Knowledge: Familiarity with data regulations (GDPR, HIPAA) relevant to different sectors like Finance or Healthcare.
Position Overview: The OSP Field Engineer performs field surveys and collects data to support fiber-optic and telecommunications engineering projects. This role documents existing aerial and underground utility conditions, validates routes, captures measurements, and ensures all information required for accurate design and permitting is collected.
Key Responsibilities
· Perform field walks to document poles, UG paths, utilities, ROW boundaries, and site conditions.
· Capture required photos: pole tags, heights, cable labels, MPOE, handholes, anchors, clearances, and obstructions.
· Measure spans, attachment heights, mid-span clearances, and accurately record GPS coordinates.
· Take detailed field notes following OSP Design Principles and maintain accurate route documentation.
· Convert field notes into Google Earth/KMZ format (preferred).
· Identify constructability issues, easements, conflicts, and permitting requirements.
· Upload and manage data using platforms such as Katapult Pro or IQGEO.
· Communicate findings with project managers, engineering teams, and other stakeholders.
Qualifications
· Experience in OSP fielding, telecom construction, or utility surveying (preferred).
· AT&T experience is preferred (IFP/ASE/GPON exposure a plus).
· Knowledge of Google Earth; ability to create or edit KMZ/KML files is a plus.
· Understanding of FTTx and AT&T GPON design is a plus.
· Must be legally authorized to work in the United States of America.
· Ability to work outdoors in all conditions and perform physically demanding field tasks.
· Valid driver’s license and reliable transportation.
Additional Requirements
· Flexible for short business trips and occasional weekend work (rare).
· Able to manage timelines efficiently while handling multiple projects simultaneously.
· Highly organized, capable of tracking deadlines, deliverables, and project expectations.
Skills
· Field surveying & measurement accuracy
· Photo documentation
· Map/GIS reading
· Problem-solving
· Time management & organization
· Knowledge of aerial/underground telecom infrastructure (preferred)
Role and Responsibilities:
- Lead design of the financial process/integration platform and patterns, and support the build team
- Ensure compliance of the data architecture with relevant financial regulations, accounting standards, and data privacy requirements
- Design and implement data security controls, such as data encryption, access controls, and audit trails, across the Oracle Fusion Financials ERP ecosystem
- Collaborate with Security and Compliance teams to ensure adherence to industry standards and regulatory requirements across SEA for the insurance industry
- Support the definition of solution design, test plans, and migration plans, and execute data migration activities from legacy systems to Oracle Fusion Financials ERP
- Collaborate with integration specialists and technical experts to ensure seamless data flow and integration between Oracle Fusion Financials modules and other enterprise systems
Skills Required:
8+ years of experience designing and implementing data architectures for complex Oracle Fusion financial systems, preferably in the Oracle Subledgers (AP, AR, FA, Projects, Lease Accounting)
Strong expertise in Oracle Fusion Financials ERP Subledger modules and their underlying data structures including period close processes
Familiarity with financial accounting concepts and processes
Responsibilities:
Should have a minimum of 8-10+ years of experience.
Good communication skills and a proactive attitude.
Understand client and stakeholder requirements.
Perform due diligence on tasks and deliver with high quality, on time.
Understand and align with the customer's culture.
Oracle E-Business Suite (EBS) Developer - Technical
Experience in data collection processing
Strong experience in Oracle
Experience in application maintenance and support.
12 years of experience required
About The Role:
We are seeking a skilled Sr. Business Analyst with knowledge of the healthcare Payer & TPA industry and experience with premium billing, payments, and contribution accounting. This role involves requirements gathering, documentation, and analysis to align with the goals of the Program. We are seeking a passionate, results-oriented candidate who excels in client interactions and can collaborate effectively with internal and external stakeholders to analyze and document requirements for a client or product. This person must possess a blend of business and technical knowledge, experience in process mapping, and strong communication skills.
Primary Responsibilities:
Requirements Gathering & Analysis: Collaborate with stakeholders, including business units, technical teams, and clients, to gather and document detailed business requirements and process flows for health insurance systems, applications, and processes.
Solution Design & Implementation: Work closely with technical teams to translate business requirements into functional specifications, user stories, and system designs. Serve as the primary point of contact for development teams to ensure all requirements are fully understood and addressed in system designs and solutions.
Testing & Quality Assurance: Support the testing process by ensuring business requirements are correctly implemented, validating system functionality, and conducting user acceptance testing (UAT).
Experience working with and managing a small remote development team.
Essential Qualifications
Experience: 10-15 years of experience as a Business Analyst in health insurance systems and/or billing/payment systems.
Technical Skills: Proficiency in MS Office and other business analysis tools (e.g., SQL or a similar query language, Microsoft Office Suite, data import/export tools).
Knowledge: Deep understanding of the health insurance landscape, including Contribution Accounting, Billing and Payments, and System Integrations. Ability to document requirements and processes.
Analytical Skills: Strong problem-solving abilities with a keen eye for detail and a data-driven approach to analysis.
Communication: Excellent written and verbal communication skills with the ability to present complex concepts to both technical and non-technical stakeholders.
Collaboration: Proven ability to work cross-functionally with diverse teams, including IT, business operations, external partners, and remote teams.
Healthcare Certifications: Any relevant certifications in health insurance, healthcare analytics, or related areas would be an advantage.
Contribution Accounting Experience: Familiarity with Contribution Accounting, billing, and payment systems is a plus.
Strong attention to detail and organizational skills.
Ability to manage multiple tasks and prioritize effectively, with a commitment to delivering high-quality solutions.
Ability to adapt quickly to changing requirements and priorities.
Job Description:
GCP Experience
• Recent GCP experience
• Experience building data pipelines in GCP
• GCP Dataproc, GCS, and BigQuery experience
• Hands-on experience with developing data warehouse solutions and data products
• Hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, and Airflow or another workflow orchestration solution (required)
• Hands-on experience in modeling and designing schemas for data lakes or for RDBMS platforms
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Practice working with, processing, and managing large data sets (multi TB/PB scale).
• Exposure to test driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's Degree in computer science or equivalent experience.
The most successful candidates will also have experience in the following:
• Gitflow
• Atlassian products – BitBucket, JIRA, Confluence etc.
• Continuous Integration tools such as Bamboo, Jenkins, or TFS
12+ years of experience required
Experience as an SAP Project Manager with a successful track record of managing complex S/4 HANA upgrade projects.
Develop a detailed project plan including timelines, milestones, resource allocation, and risk mitigation strategies.
Define project scope, considering business requirements and technical limitations.
Deep understanding of SAP S/4 HANA functionalities across various modules (FI, CO, MM, SD, etc.) and their impact on the upgrade process.
Lead the technical team in performing system analysis, data migration, and configuration changes necessary for the S/4 HANA upgrade.
Expertise in project management methodologies (Agile, Waterfall) and tools.
Strong analytical and problem-solving skills to identify and resolve technical challenges during the upgrade.
Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams and stakeholders.
Collaborate with stakeholders to gather functional requirements and prioritize features for upgrade.
Familiarity with data migration strategies and tools for SAP upgrades.
Knowledge of SAP best practices and industry standards related to S/4 HANA upgrade
Experience in the Pharma and Life Sciences industry with CSV (Computer System Validation) documentation is needed.
Job Description:
We are looking for a passionate and experienced Site Reliability Engineer (SRE) with a strong focus on observability as part of our Data Center Exit Program. The ideal candidate will be responsible for building and maintaining reliable systems, ensuring high availability, and improving the performance of our infrastructure. You will play a critical role in designing, implementing, and managing observability solutions to provide deep insights into systems and applications.
Key Responsibilities:
Design, implement, and maintain observability solutions, including monitoring, logging, and tracing systems.
Develop and maintain dashboards, alerts, and reports to monitor system performance and health.
Collaborate with development and operations teams to integrate observability into the software development lifecycle.
Troubleshoot and resolve issues related to system performance, reliability, and scalability.
Automate repetitive tasks and processes to improve efficiency and reduce manual intervention.
Continuously evaluate and improve observability tools and practices to ensure they meet organizational needs.
Document observability processes, best practices, and guidelines for the team.
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
5+ years of experience as a Site Reliability Engineer or in a similar role with a focus on observability.
Strong experience with Java/J2EE and microservices.
In-depth knowledge of observability tools and technologies, including Prometheus, Grafana, ELK Stack, Jaeger, etc.
Proficiency in scripting and automation using languages such as Python, Bash, or similar.
Experience with cloud platforms (GCP, Azure) and containerization technologies (Docker, Kubernetes).
Excellent problem-solving skills and the ability to work under pressure.
Strong communication and collaboration skills.
Preferred Qualifications:
Experience with Infrastructure as Code (IaC) tools such as Terraform or Ansible.
Familiarity with CI/CD pipelines and DevOps practices.
Knowledge of security best practices and compliance requirements.
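The dashboarding and alerting responsibilities above typically rest on SLO arithmetic: an availability target implies a fixed error budget, and alerts fire when that budget burns too fast. A minimal sketch of the math (the 99.9% target, 30-day window, and function names here are assumed examples, not from the posting):

```python
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Allowed downtime in minutes over the window for an availability SLO."""
    return (1.0 - slo) * window_days * 24 * 60


def burn_rate(observed_error_ratio: float, slo: float) -> float:
    """Multiple of the sustainable burn: 1.0 spends the budget exactly over
    the full window; values above 1.0 exhaust it early and should alert."""
    return observed_error_ratio / (1.0 - slo)


# Example: a 99.9% availability SLO over 30 days.
budget = error_budget_minutes(0.999)                     # ~43.2 minutes of budget
rate = burn_rate(observed_error_ratio=0.002, slo=0.999)  # ~2x: budget gone in ~15 days
print(budget, rate)
```

In practice these quantities are computed from Prometheus metrics and rendered in Grafana, but the underlying calculation is no more than this.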
Test Strategy and Planning:
Develop and implement comprehensive test strategies and plans.
Define test objectives, scope, and criteria for success.
Coordinate with project managers and development teams to ensure alignment with project goals.
Test Execution and Management:
Oversee the execution of manual and automated tests.
Ensure thorough regression testing is conducted to maintain software quality.
Manage test environments and ensure they are properly configured.
Automation and Tools:
Lead the development and maintenance of automated test scripts using Selenium and Python.
Identify opportunities to improve test automation coverage and efficiency.
Integrate automated tests into CI/CD pipelines.
Quality Assurance:
Monitor and report on test progress, defects, and overall quality metrics.
Collaborate with stakeholders to address and resolve issues promptly.
Retail Sector Focus:
Apply domain knowledge of the retail industry to ensure relevant and effective testing.
Understand retail-specific requirements and challenges to tailor testing approaches.
Strong knowledge of testing methodologies, tools, and automation frameworks.
Experience with web test automation tools such as Selenium
Familiarity with Agile development and DevOps testing environments.
Knowledge of CI/CD tools like Jenkins, Git, or Azure DevOps is a plus.
Knowledge of performance testing and load testing tools like JMeter.
Knowledge of scripting languages such as Python, Java, or JavaScript for web test automation.
Knowledge of API testing tools like Postman or REST Assured.
Experience with mobile test automation tools such as Appium
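The Python test-automation style this role implies can be sketched with a pure-logic stand-in (a real Selenium example needs a live browser; the `bulk_discount` retail pricing rule below is invented for illustration):

```python
import unittest


def bulk_discount(unit_price: float, quantity: int) -> float:
    """Hypothetical retail pricing rule: 10% off orders of 10 or more units."""
    total = unit_price * quantity
    return total * 0.9 if quantity >= 10 else total


class BulkDiscountTests(unittest.TestCase):
    """The regression suite a CI/CD pipeline would run on every commit."""

    def test_no_discount_below_threshold(self):
        self.assertEqual(bulk_discount(5.0, 9), 45.0)

    def test_discount_applies_at_threshold(self):
        self.assertAlmostEqual(bulk_discount(5.0, 10), 45.0)

    def test_zero_quantity(self):
        self.assertEqual(bulk_discount(5.0, 0), 0.0)


if __name__ == "__main__":
    unittest.main()
```

Wired into Jenkins, Git hooks, or Azure DevOps, the same suite (with Selenium or Appium drivers replacing the pure function) becomes the automated regression gate the posting describes.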
Mandatory Skill Sets
Technology expectations
o 5+ years of experience in Hadoop/Big Data
o 2+ years of experience in strategic data planning, standards, procedures, and governance
o 4+ years of hands-on experience in Python or Scala
o 3+ years of experience in writing and tuning SQL and Spark queries
o Experience in understanding and managing Hadoop log files and Hadoop's multiple data processing engines
o Experience in analyzing data in HDFS through MapReduce, Hive, and Pig
o Experience with scripting languages: Python, Scala, etc.
o 3+ years of experience in a GCP environment
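The MapReduce pattern named above can be sketched in plain Python, with no Hadoop cluster needed for the illustration (the input lines are made up; real jobs would read from HDFS and run the phases across nodes):

```python
from collections import defaultdict
from itertools import chain


def mapper(line: str):
    """Map phase: emit a (word, 1) pair for each word in a line."""
    for word in line.lower().split():
        yield word, 1


def shuffle(pairs):
    """Shuffle phase: group values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped


def reducer(key, values):
    """Reduce phase: sum the counts for each word."""
    return key, sum(values)


lines = ["big data big pipelines", "data lakes"]
pairs = chain.from_iterable(mapper(line) for line in lines)
counts = dict(reducer(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 1, 'lakes': 1}
```

Hive and Pig compile queries and scripts down to exactly this three-phase shape, which is why understanding it matters even when no one writes raw MapReduce anymore.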
Other required skills
o Strong verbal and written communication skills to effectively share findings with stakeholders
o Good understanding of web-based application development
o Should be an independent contributor from day one
o A bachelor's degree in computer science or a relevant field is mandatory
o Should be hands-on and able to work in an agile, fast-moving environment
