Microsoft .NET Web Developer - WI

Location: Madison, WI

Duration: 12+ Months

General Competencies
Ability to work independently, using critical thinking and discretionary decision-making skills - 5 years
Information Technology
Analyze requirements, architect solutions, and implement and test final technical solutions - 5 of 5 (Expert), 5 years
Analyze software and design information flow; make recommendations as necessary - 5 of 5 (Expert), 5 years
Information Technology - Application Development
Application development and design experience using Visual Studio 2015-2017 - 5 of 5 (Expert), 3 years
Information Technology - Applications
Application development and support of IIS/ASP.NET/MVC/DB2/Oracle server-based systems using Visual Studio - 5 years
Information Technology - Architecture
.NET application and web application design patterns - 5 years
Information Technology - Databases
Database design - 5 of 5 (Expert), 3 years
Information Technology - Languages/Tools
ASP.NET MVC, C#, Web API - 5 years
ASP.NET Web Forms (VB.NET and C#) - 5 years
Information Technology - Operating Systems
Team Foundation Server (TFS) - 3 years
Information Technology - Requirements
Analysis and design of applications - 5 of 5 (Expert), 5 years
Information Technology - Testing
Ability to perform testing of applications, facilitate testing by others, document results, and facilitate efforts to uncover and fix issues found during testing - 5 years
Develop plans for acceptance testing - 3 of 5 (Moderately Strong), 3 years
Develop test plans/procedures and training documentation - 3 years
Perform software bug tracking - 5 of 5 (Expert), 5 years

 kennethm@cci-worldwide.com


Microsoft Power BI Developer - WA

Location: WA – Remote for now.

6 Months+

 


Required Skills:

• Microsoft Power BI Analytics - Data Visualization

Responsibilities:             

• Sustain and enhance visualization tools for data reporting using Power BI; document processes and train client staff on Power BI and these processes

• Provide informal training to support surveillance staff and improve their knowledge of Power BI, data quality assurance, data management, and data visualization


Raman Kumar Mahto
Phone/Text: 916 605 4633

raman.mahto@agreeya.com


Big Data Engineer - Orlando, FL

Location: Orlando, FL

Duration: 6 Months

Client: Happiest Mind

 

Responsibilities:
Build cool things – Build scalable analytics solutions, including data processing, storage, and serving of large-scale data through batch and stream analytics, for both behavioral and ad-revenue data across digital and non-digital channels.
Harness curiosity – Change the way we think about, act on, and utilize our data by performing exploratory and quantitative analytics, data mining, and discovery.
Innovate and inspire – Think of new ways to help make our data platform more scalable, resilient, and reliable, and then work across our team to put your ideas into action.
Think at scale – Lead the transformation of a petabyte-scale, batch-based processing platform to a near real-time streaming platform using technologies such as Apache Kafka, Cassandra, Spark, and other open-source frameworks (see the sketch after this list).
Have pride – Ensure performance isn't our weakness by implementing and refining robust data processing using Python, Java, and Scala, along with database technologies such as Redshift or Snowflake.
Grow with us – Help us stay ahead of the curve by working closely with data architects, stream processing specialists, API developers, our DevOps team, and analysts to design systems that can scale elastically in ways that make other groups jealous.
Lead and coach – Mentor other software engineers by developing reusable frameworks. Review design and code produced by other engineers.
ML first – Provide expert-level advice to data scientists, data engineers, and operations to deliver high-quality analytics via machine learning and deep learning through data pipelines and APIs.
Build and support – Embrace the DevOps mentality to build, deploy, and support applications in the cloud with minimal help from other teams.
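
For illustration only (not part of the client's posting), here is a minimal PySpark Structured Streaming sketch of the kind of Kafka-to-Spark pipeline described above. The broker address, topic name, and event schema are placeholder assumptions; a real deployment would write to Cassandra, Redshift, or Snowflake rather than the console.

# Minimal sketch, assuming placeholder broker/topic names and a simple event schema.
# Requires Spark's Kafka connector package on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("ad-revenue-stream").getOrCreate()  # hypothetical app name

event_schema = StructType([
    StructField("channel", StringType()),       # e.g. digital vs. non-digital
    StructField("revenue", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from Kafka and parse the JSON payload.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
          .option("subscribe", "ad-events")                   # placeholder topic
          .load()
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Aggregate revenue per channel in 5-minute windows, tolerating 10 minutes of late data.
revenue = (events
           .withWatermark("event_time", "10 minutes")
           .groupBy(window(col("event_time"), "5 minutes"), col("channel"))
           .sum("revenue"))

# Console sink for demonstration only; a production job would use a Cassandra/Snowflake sink.
query = revenue.writeStream.outputMode("update").format("console").start()
query.awaitTermination()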
 
Required Qualifications:
2+ years of development experience in key-value store databases such as DynamoDB, Cassandra, ScyllaDB, etc.
 
AND/OR
 
2+ years of development experience in graph databases such as AWS Neptune, Neo4j, JanusGraph, etc.
 
Basic Qualifications:

Not your first rodeo – Have 4+ years of experience developing data-driven applications using a mix of languages (Java, Scala, Python, SQL, etc.) and open-source frameworks to implement data ingestion, processing, and analytics technologies.
Data and API ninja – You are also very handy with big data frameworks such as Hadoop and Apache Spark, NoSQL systems such as Cassandra or DynamoDB, and streaming technologies such as Apache Kafka; you understand reactive programming and dependency injection frameworks such as Spring for developing REST services.
Have a technology toolbox – Hands-on experience with newer technologies relevant to the data space, such as Spark, Airflow, Apache Druid, and Snowflake (or any other OLAP database).
Cloud first – Plenty of experience developing and deploying in a cloud-native environment, preferably AWS.


ldhir@metasysinc.com

470-571-1269


SSAS Developer: Kaiser Permanente, Pleasanton, CA

Location: Pleasanton, CA

Please contact Aparna Ghosh at (925) 627 4984 or email aparna@ascentsg.com

An ideal candidate should have excellent communication skills, solid programming and problem-solving skills, and demonstrated experience in Power BI visualizations and data modeling.

  • Understand business problems and suggest data and analytic solutions to support decision makers.
  • Help develop and transform data sets using a variety of programming languages.
  • Develop and test analytic hypotheses using established methods.
  • Create data models and visualizations in Power BI that convey a clear story about what is observed in the data.
  • Communicate results to a variety of audiences.

 
Qualifications:

  • 2-4 years’ experience in DAX data modeling and Power BI visualizations
  • 2-4 years’ experience writing SQL-like queries against various databases (i.e., T-SQL).
  • Bachelor’s degree with evidence of quantitative aptitude.
  • 2-4 years’ experience doing analysis, model building, and causal inference.

 

Preferred Experience

  • Experience with SSAS Data models
  • 2-4 years’ work experience is preferred.
  • Bachelor’s or master’s degree in computer science, statistics, data science, mathematics, engineering, or another quantitative field is preferred
  • Analytic or operational experience and/or education in Pharmacy, Nursing, Public Health, or other relevant medical topics is appreciated.

DB2 UDB/Linux DBA - O'Fallon, MO

Migrate databases from mainframe to LUW v11.2; should have in-depth knowledge of mainframe as well as LUW.

• Install DB2 v11 and DB2 Connect versions and set up 32-bit/64-bit ODBC connectivity

• Set up 4-way high availability and maintain production databases

• Perform code conversion and ensure data compatibility between mainframe and LUW, and design LUW databases according to mainframe standards.

• Perform multiple table/tablespace/database restorations to any point in time without impacting any other table/tablespace.

• Design databases, tablespaces, buffer pools, and referential constraints from mainframe to LUW standards without impacting overall performance.

• Expertise in setting up Splunk dashboards and alerting mechanisms to generate tickets.

• Able to configure and set up QMA and Tivoli systems for automation.

• Ability to configure and set up DB2 native encryption and LDAP, and have users connect to databases through LDAP.

• Ability to set up the DB2 audit facility to syslog at the table/database level for specific operations.

• Ability to understand DSNUTIL and modify mainframe jobs to run on LUW without data issues.

• Fix data incompatibility issues from mainframe to LUW and ensure no performance issues occur.

• Knowledge of COBOL programs, JCL, and mainframe load/unload is a must.

• Ability to explain LUW and mainframe design aspects to all stakeholders and fix issues as and when they arise.

• Ability to showcase LUW database availability/recoverability during hard failover, soft failover, and disaster recovery scenarios.

• Upgrade/apply patches as needed.

• Able to write shell scripts to automate failover alerts for databases and set up monitoring on various aspects such as failover, space, CPU, memory, log-full conditions, deadlocks, DB2 instance down, prolonged reorgs, runstats, etc.

• Should be able to handle queries/changes/technical challenges as they arise during this project without escalations.

• Should be able to performance-tune databases, analyze explain plans in detail, and recommend changes to stakeholders with supporting evidence.

• Able to create POCs based on unique requirements and drive towards resolution.

• In-depth knowledge of and experience with DB2 INGEST.

