Have Any Questions?
Call Now: (425) 280-3059


1. Software Developer
2. Informatica Developer
3. QA Engineer
4. Sr. Data Analyst

Software Developer
Location: TX
Education: Bachelor’s degree in Computer Science or Computer Applications.

Job Responsibilities:

Configuring new RPA processes and objects in Automation Anywhere using core workflow principles, so that they are efficient, well-structured, maintainable, and easy to understand.
Building software robots to automate end-to-end business processes and reduce manual intervention.
Managing dashboards, source control, and licenses, and handling the Control Room and Bot Farm.
Automating order processing, data entry, and report generation.
Providing technical support for existing bots and monitoring the servers on a daily basis.
Performing deployments for upgrades and supporting operational teams during rollout phases.
Creating and executing unit tests, and documenting code fixes to record fixes and dependencies according to established standards.
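
The unit-testing duty above can be sketched in Python. The helper and the bug it fixes are hypothetical, invented only to show how a test can document a fix and its dependency:

```python
import unittest


def normalize_order_id(raw):
    """Hypothetical helper from a bot's order-processing step.

    The .strip() call is the documented fix: IDs pasted from
    spreadsheets carried stray whitespace and failed lookups.
    """
    return raw.strip().upper()


class TestNormalizeOrderId(unittest.TestCase):
    def test_strips_whitespace(self):
        # Regression test memorializing the whitespace fix.
        self.assertEqual(normalize_order_id("  ord-42 "), "ORD-42")

    def test_uppercases(self):
        self.assertEqual(normalize_order_id("ord-42"), "ORD-42")
```

Run with `python -m unittest`; keeping the regression test in the suite is what "memorializes" the fix against future changes.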

Send your resume, specifying the position you are seeking, to:

Informatica Developer
Location: CA
Education: Bachelor’s degree in Computer Science, Computer Applications, or a related Engineering discipline, or equivalent.

Job Responsibilities:

Analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.

Transforming data to create workflow model metadata, cubes, and maps per business requirements. Workflows developed in lower environments are promoted to production after quality-testing approval from business teams.

Implementing data warehouse/data mart, ODS, OLTP, and OLAP solutions, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

Using Informatica PowerCenter Designer to analyze source data and extract and transform it from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules through the objects and functions the tool supports.

Designing and developing mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations. Managing the metadata associated with the ETL processes used to populate the data warehouse.

Data modeling using Erwin, including star schema and snowflake modeling, fact and dimension tables, and physical and logical models.

Interpreting logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER/Studio.

Working with the functional team to ensure required data has been extracted and loaded, performing unit testing, and fixing errors to meet requirements.

Copying, exporting, and importing mappings, sessions, worklets, and workflows from the development repository to the test repository, and promoting them to production.

Using session parameters and mapping variables/parameters, and creating parameter files to allow flexible workflow runs based on changing variable values.
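
A minimal sketch of generating such a parameter file from Python. The folder, workflow, session, and parameter names are hypothetical; the `[folder.WF:workflow.ST:session]` heading follows PowerCenter's documented parameter-file layout:

```python
import os
import tempfile


def write_param_file(path, folder, workflow, session, params):
    """Write a per-session Informatica parameter file.

    Regenerating this file before each run lets the same workflow
    execute with different variable values.
    """
    lines = ["[%s.WF:%s.ST:%s]" % (folder, workflow, session)]
    lines += ["%s=%s" % (name, value) for name, value in params.items()]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
    return lines


# Example run with placeholder names and values.
path = os.path.join(tempfile.gettempdir(), "wf_daily_load.param")
lines = write_param_file(
    path, "SALES", "wf_daily_load", "s_m_load_orders",
    {"$$LoadDate": "2021-02-01", "$DBConnection_Source": "ORA_SRC"},
)
```

The workflow then picks the file up via its "Parameter Filename" setting (or pmcmd's parameter-file option), so only the generator changes between runs.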

Working with static, dynamic, and persistent caches in Lookup transformations for better session throughput. Using the pmcmd command to automate PowerCenter sessions and workflows through UNIX.
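
One common way to wrap such pmcmd automation from Python on UNIX is sketched below; the flags follow the standard `pmcmd startworkflow` usage, and every connection detail is a placeholder, not a real environment:

```python
def pmcmd_startworkflow(service, domain, user, password, folder, workflow):
    """Build the argv list for starting a PowerCenter workflow via pmcmd."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,      # Integration Service name
        "-d", domain,        # Informatica domain
        "-u", user,
        "-p", password,
        "-f", folder,        # repository folder
        "-wait",             # block until the workflow finishes
        workflow,
    ]


cmd = pmcmd_startworkflow(
    "INT_SVC", "Domain_Dev", "etl_user", "secret", "SALES", "wf_daily_load"
)
# On an Informatica host, a scheduler wrapper would then run:
#   subprocess.run(cmd, check=True)
# (not executed here, since pmcmd only exists on PowerCenter client machines)
```

Using `-wait` makes the call synchronous, so the scheduler sees a nonzero exit code if the workflow fails.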

Extracting data from Oracle databases and spreadsheets, staging it in a single location, and applying business logic to load it into the central Oracle database.

Preparing code for all modules according to required specifications and client requirements, and preparing all required test plans.

Creating reports from scheduled jobs for audit purposes, sending them to internal audit teams to verify that the reports are compliant, and reporting compliance-run results to the appropriate business owners on daily, weekly, and monthly schedules.

Preparing UNIX shell scripts that are scheduled in AutoSys for automatic execution at specified times.

Preparing test scenarios and test cases in HP Quality Center, and participating in unit testing of mappings, system testing, and user acceptance testing.

Send your resume, specifying the position you are seeking, to:

QA Engineer
Location: Remote (until COVID-19) and Burr Ridge, IL
6+ years of QA experience

Job Responsibilities:

Own the Software QA process, including developing, implementing, and maintaining test plans, test scenarios and test cases.

Analyze requirements and design specifications for test case development.

Recommend test automation approach, tools, and framework.

Develop test infrastructure and custom automation tools as needed to expand test coverage and enable non-functional testing.

Perform manual and automated tests for our website and applications.

Prioritize test execution.

Find and report defects with detailed, accurate and concise steps to reproduce.

Assist developers in discovering and researching defects and recommend system enhancements.

Complete ownership of all testing across multiple applications built with a varied set of tools/technologies.

Hands on testing that includes analyzing requirements, preparing test plans, and building appropriate test cases to validate the functionality being built.

Support and execute the application testing phase (functional and non-functional) to ensure all software meets requirements before changes are placed in production.

Liaise and coordinate with other technology groups (across sites) to execute end-to-end testing.

Drive all test automation efforts, including planning, hands-on scripting, and oversight of other resources working on automation.

Education: Bachelor’s in Computer Science or a related field, or equivalent experience

Send your resume, specifying the position you are seeking, to:

Sr. Data Analyst
Location: TX 75024
Educational Qualifications: Bachelor’s degree
Posted date: Feb 2, 2021

Job Responsibilities:

• Developing and working on an AI-powered tool to quickly generate hypotheses about performance drivers and identify performance anomalies.
• Working on data pipelines that load data from various sources into Snowflake tables using Python ETL processes and AWS data pipelines.
• Designing and developing controls that perform suppressions on the data for marketing needs.
• Developing data-driven documents for visualization using D3.js to report data generated for analytical purposes.
• Developing a Python module to query data from Snowflake and manipulate it using the pandas library.
• Designing and developing a data-control platform, running on a Node.js server, for reporting analyzed data flagged as exceptions for analytical purposes.
• Creating data pipelines in AWS to extract data from various sources and load it into Snowflake warehouses using Python and PySpark.
• Designing and developing enterprise applications using Spark, Python, SQL, and AWS services.
• Developing a framework using shell scripts, Spark, Python, and SQL to move quality-checked, filtered data into the Snowflake data warehouse on the Amazon cloud, which is leveraged by various data-migration teams.
• Extracting data from various source systems using Python and landing it on AWS S3.
• Working on CloudFormation templates for building 2-tier and 3-tier applications.
• Building servers using EC2, Auto Scaling, load balancers (ELBs), and security groups.
• Developing a web service on the Redshift data warehouse using the Flask framework, which serves as the backend for a real-time dashboard.
• Managing application performance by optimizing Spark partitions and caching effectively to improve performance in Amazon EMR.
• Developing metadata-extraction and data-quality automation tools and scheduling jobs using UNIX shell scripting, Python, and REST APIs.
• Conducting unit and integration testing, and coordinating UAT and production releases.
• Implementing continuous integration (CI) and continuous delivery (CD) processes using Jenkins, along with scripts that automate routine jobs to speed up deployments.
• Developing various Python and PySpark scripts for data transformations, adjusting to our needs and customizing data formats.
• Developing data pipelines that transfer data into the various Snowflake layers, applying several transformations via Python scripts.
• Creating SQL queries for CRUD (create, read, update, and delete) operations on the Snowflake database using Python and SQL.
• Designing and developing safety controls, using Python scripts in the applications, to ensure enterprise guidelines on data use are followed.
• Developing new features based on gathered requirements and enhancing the application to accommodate new additions.
• Developing new frontend interface additions using D3.js to accommodate new data-layer additions.
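
The CRUD duty in the list above can be sketched as a small Python helper. Placeholders use the %(name)s "pyformat" style accepted by the Snowflake Python connector; the table and column names are hypothetical, and since identifiers cannot be bound as parameters, real code would validate them against a whitelist:

```python
def crud_statements(table, key, columns):
    """Build parameterized CRUD statements for one table."""
    cols = ", ".join(columns)
    binds = ", ".join(f"%({c})s" for c in columns)
    sets = ", ".join(f"{c} = %({c})s" for c in columns if c != key)
    return {
        "create": f"INSERT INTO {table} ({cols}) VALUES ({binds})",
        "read": f"SELECT {cols} FROM {table} WHERE {key} = %({key})s",
        "update": f"UPDATE {table} SET {sets} WHERE {key} = %({key})s",
        "delete": f"DELETE FROM {table} WHERE {key} = %({key})s",
    }


stmts = crud_statements("orders", "order_id", ["order_id", "status"])
# With a live connection, values are bound at execute time, e.g.:
#   cursor.execute(stmts["read"], {"order_id": 42})
```

Binding values through the connector rather than interpolating them into the SQL string keeps the queries safe from injection and lets Snowflake reuse the statement.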

Send your resume, specifying the position you are seeking, to: