

                      Datawarehousing ETL Jobs in USA - 2144

                      Senior Data Analyst  San Francisco, CA
                      ($) : Market
                      Senior Data Analyst SFO, CA 12 Months Contract C2C/W2/1099 *Need Local Consultants Only* Requires 10+ years of progressive experience in developing and supporting data and reporting projects. Requires 7+ years of in-depth experience using ETL tools like SSIS, SnowSQL, Informatica and BI tools including SSRS, Power BI, Tableau, etc. Requires 2+ years of experience using Snowflake capabilities - SnowSQL, Tasks, Streams, Time Travel, Data Sharing and stored procedures. Proficiency in one of the scripting languages such as Java or Python is preferred. Experience with Reporting and Dashboards including BI data modeling and architecture in the financial services industry is preferred.
                      Jun-08-20
                      Title: Scientific Analyst Location: Princeton, NJ 08543 Duration: 6 months Comments: Strong proficiency in Data Visualization tools (Spotfire, Tableau, PowerBI, etc.) and specifically able to create, manage and evolve a PowerBI dataset. Deep experience in MS Excel including the ability to create Macros & use VBA, formulas and data merging/comparison techniques, knowledge of the data model, expert knowledge of pivot tables and pivot charts and a general understanding of coding practices. Strong proficiency in PowerQuery and related dataset transformations. Experience with leveraging Excel files, SharePoint lists, and other data sources to populate & keep updated a PowerBI dataset. Experience leveraging SharePoint lists & libraries for the purpose of collecting & aggregating data (including InfoPath & Forms) in addition to the ability to customize the site’s web parts and pages. Job Description: This role serves as a reporting analyst / data scientist to develop, merge, compare, update and report through data visualization on overlapping or disparate data sets. The analyst will need to have strong data analytic abilities, be able to solicit requirements from end users and understand how to keep data and priorities organized in order to provide clear insights for the current business need.
Required Skills & Experience: Strong proficiency in MS Excel, Word, PowerPoint, Outlook. Deep experience in MS Excel including the ability to create Macros & use VBA, formulas and data merging/comparison techniques, knowledge of the data model, expert knowledge of pivot tables and pivot charts and a general understanding of coding practices. Strong proficiency in PowerQuery and related dataset transformations. Experience with leveraging Excel files, SharePoint lists, and other data sources to populate & keep updated a PowerBI dataset. Experience leveraging SharePoint lists & libraries for the purpose of collecting & aggregating data (including InfoPath & Forms) in addition to the ability to customize the site’s web parts and pages. Strong proficiency in Data Visualization tools (Spotfire, Tableau, PowerBI, etc.) and specifically able to create, manage and evolve a PowerBI dataset. SQL or related database experience required (e.g., ability to query tables, SQL reporting services, etc.). Workflow experience required (SharePoint Designer, Microsoft Flow and related tools). Facilitation and meeting management skills. Strong communicator via all media: collaboration technologies (SharePoint, Skype, Microsoft Teams, etc.), presentations, written emails, verbal – with a wide variety of audiences. Manages multiple, sometimes conflicting, priorities, challenges the status quo, and exercises diplomacy, all while maintaining a high comfort level for ambiguity. Experience managing, configuring or heavily using enterprise commercial Project & Portfolio management & collaboration tools a plus. At least 5 years of prior experience working in a Business or Data Analyst capacity, ideally in the Pharmaceutical Industry. Proven ability in gathering, tracing, translating and managing complex requirements, business rules, and data. Excellent oral and written communication skills including technical writing / documentation; organizes and presents ideas in a convincing and compelling manner. Demonstrated ability
to use structured problem solving and available tools to quickly evaluate problems, identify root causes, create action plans, assess impact and develop resolution options. Action oriented; demonstrates accountability for results, acts with integrity and develops credibility. Demonstrates strong analytical skills, process knowledge and lateral thinking. Metric/KPI capture, analysis, and reporting. Exceptional interpersonal skills; able to communicate effectively with both technical and non-technical teams; able to provide technical leadership. Demonstrated ability to be teachable, a quick learner, adapt in a fast-paced environment and work independently when needed. Excellent teamwork and interpersonal skills, with the ability to communicate and collaborate with employees and management at all levels. Applications: ServiceNow, Excel & PowerQuery, Spotfire, Tableau, PowerBI, SharePoint & SharePoint Designer, Microsoft Flow
                      Jun-08-20
                      ($) : Market
                      Responsibilities: Software Development: o Programming complex concepts using PL/SQL, Unix and Core Java Process Compliance: o Follow industry standard coding practices. o Document artifacts. o Follow Agile methodologies: o Understand user stories tied to the backlog item/s at hand o Ensure meaningful conversion of the user story features into high-quality working code. Mentoring: o Determine methods and procedures on new assignments. o Coordinate activities with other employees. o Review and guide junior team members. Leading: o Periodic project reviews with internal stakeholders. o Provide continuous feedback to the team. o Improve productivity. o Ensure quality deliverables. Contributing to your practice/BU by: o Documenting your learnings from the current work o Producing knowledge sharing artifacts such as an internal Wiki, a blog or a best practices document. Contributing to open source projects. o Keeping updated on the latest technologies with technology trainings and certifications o Actively participating in BU/Practice level activities and events related to learning, formal training, interviewing, special projects, POC etc. o Leading team members Qualifications Required: 5-9 years of relevant experience Bachelor’s degree - Computer Science, Information Technology, Electronics Experience in leading and mentoring Ability to create design documents and unit test cases On-the-job training on market-leading MDM technologies from Architect and Lead Developers. Work with the Informatica MDM Architect and Lead Developer to build a world-class MDM solution.
Design real-time MDM solutions where MDM is accessed and leveraged for operational use cases by spoke systems Opportunity to participate in Design & Build to implement the Data Governance workflows Collaborating with enterprise-wide technical team members to build an integrated MDM solution Understanding of the MDM Services Integration Framework (SIF) Understanding of the data model and subject areas in MDM/IDD Understanding of User Exits, how to add them to the hub and configure the MDM model Defines and manages business entities, reference entities The candidate will work with remote teams to collaborate on any master data initiatives Should have done at least two implementation/support/upgrade/rollout projects in Informatica MDM Authoring detailed design documents. Exceptional written and verbal communication skills. Excellent analytical ability and critical thinking skills. Preferred: Master’s degree - Computer Science, Information Technology, Electronics Knowledge of object-oriented concepts, SOA principles. Informatica MDM Certified Professional Zeal to quickly learn and adapt to other technologies
                      Jun-08-20
                      Job Title : Data Modeler/Architect Client Company : VDOT City : Richmond State : VA Duration : 12+ months Job Description: Only looking for local candidates (kindly only submit candidates local to VA) Note* Candidates will telework now (after coming to the office to pick up a laptop). Once restrictions are lifted, candidates will be required to work onsite daily M-F 8-5pm. Interviews will be conducted via Skype or Google Hangout. Responsibilities: Create data models and XML schemas at all levels including conceptual, logical, and physical for both relational and dimensional solutions. Models will include objects, entities, attributes, their inter-relationships and dependencies in 3NF and/or dimensional formats. Integrate disparate data models into coherent enterprise data models. Forward engineer physical models to create DDL scripts to implement new databases or database changes. Reverse engineer databases and synchronize data models with actual data implementations. Create data dictionaries and business glossaries to document data lineages, data definitions and metadata for all business-critical data domains. Identify and reconcile inconsistencies in data definitions, e.g., synonyms and homonyms. Work to identify master data (entities and attributes) and capture how data is interpreted by users in various parts of the organization. Capture business rules that govern how data is transformed, integrated, and used.
Develop canonical models and data transformation rules using XML. Skill Matrix Table/Requirements (Skills / Candidate's Experience in years): Broad understanding of Data Architecture and Data Management approaches and implementation methodologies Ability to work with business as well as IT stakeholders Minimum of 5+ years of experience designing large scale data models for functional, operational and analytical environments Demonstrated experience in Conceptual, Logical, Physical and Dimensional Modeling Hands-on experience with data modeling tools Knowledge of BI tools, such as Power BI, Tableau Experience with Agile/Scrum is valuable Ability to work creatively and analytically in a team environment Excellent communication and documentation skills 5+ years of experience as Data Modeler/Data Architect creating data models and working in Data Architecture and Data Management solutions.
                      Jun-08-20
                      ($) : Open
                      Data Modeler/Architect Location: Richmond VA 23219 Duration: 12 Months Required Skills : Please follow the skill matrix table attached in the JD. Only looking for local candidates (kindly only submit candidates local to VA). Any visa status (please make sure the candidate's visa is not expiring within the contract duration). Responsibilities: Create data models and XML schemas at all levels including conceptual, logical, and physical for both relational and dimensional solutions. Models will include objects, entities, attributes, their inter-relationships and dependencies in 3NF and/or dimensional formats. Integrate disparate data models into coherent enterprise data models. Forward engineer physical models to create DDL scripts to implement new databases or database changes. Reverse engineer databases and synchronize data models with actual data implementations. Create data dictionaries and business glossaries to document data lineages, data definitions and metadata for all business-critical data domains. Identify and reconcile inconsistencies in data definitions, e.g., synonyms and homonyms. Work to identify master data (entities and attributes) and capture how data is interpreted by users in various parts of the organization. Capture business rules that govern how data is transformed, integrated, and used.
Develop canonical models and data transformation rules using XML. Skill Matrix Table/Requirements (Skills / Candidate's Experience in years): Broad understanding of Data Architecture and Data Management approaches and implementation methodologies Ability to work with business as well as IT stakeholders Minimum of 5+ years of experience designing large scale data models for functional, operational and analytical environments Demonstrated experience in Conceptual, Logical, Physical and Dimensional Modeling Hands-on experience with data modeling tools Knowledge of BI tools, such as Power BI, Tableau Experience with Agile/Scrum is valuable Ability to work creatively and analytically in a team environment Excellent communication and documentation skills 5+ years of experience as Data Modeler/Data Architect creating data models and working in Data Architecture and Data Management solutions.
                      Jun-08-20
                      Cliecon Solutions Inc is hiring a Lead ETL Developer for our client in Clinton, NJ. Please contact . Lead ETL Developer Support of the application and the business. Vendor and in-house projects. Hands on with development and mentorship. The business being supported has high visibility, making this a great application to gain experience with. Top skills: 8+ years of ETL experience in tools like Informatica, Talend, SSIS 2 years of Lead experience Day to day responsibilities: Design and develop customized solutions for the team to support critical business functions, meet project objectives and company goals Oversee and provide technical solutions design and delivery, including integration with existing architecture. Troubleshoot and support the existing application stack. Code, test, debug, document and implement complex software projects using ETL, Core Java and other proprietary technologies Mentor developers and lead the application development team for on-time delivery
                      Jun-08-20
                      ($) : Depends on Experience
                      Hi, Please go through the below requirement and let me know your interest. Job Title: Sr. Informatica Administrator (Axon, EDC, IICS - all 3 skills mandatory) Location: NYC, NY; 75% remote & 25% travel to client location, expenses covered Job Overview: As a Lead Technical Consultant you will participate in all aspects of the software development lifecycle, which includes estimating, technical design, implementation, documentation, testing, deployment and support of applications developed for our clients. As a member working in a team environment, you will work with solution architects and developers on interpretation/translation of wireframes and creative designs into functional requirements, and subsequently into technical design. Responsibilities: Informatica Intelligent Cloud Services (IICS) Administration, seasoned in importing/exporting services under deployments. Informatica Axon Data Governance and Enterprise Data Catalog (EDC) Implementation and Administration Qualifications: 5+ years of experience working on the Informatica platform, including but not limited to the following: o Informatica Intelligent Cloud Services (IICS) Administration, seasoned in importing/exporting services under deployments. o Informatica Axon Data Governance Administration experience o Informatica Enterprise Data Catalog (EDC) Administration experience o Well-versed with all Informatica Client Components Client facing or consulting experience required. Skilled problem solvers with the desire and proven ability to create innovative solutions. Flexible and adaptable attitude, disciplined to manage multiple responsibilities and adjust to varied environments. Future technology leaders - dynamic individuals energized by fast paced personal and professional growth. Phenomenal communicators who can explain and present concepts to technical and non-technical audiences alike, including high level decision makers.
Bachelor’s Degree in MIS, Computer Science, Math, Engineering or a comparable major. Solid foundation in Computer Science, with strong competencies in data structures, algorithms and software design. Knowledge and experience in developing software using agile methodologies. Proficient in authoring, editing and presenting technical documents. Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
                      Jun-08-20
                      ($) : USD 80 / Hourly / C2C
                      Job Title: ETL Lead Location: Pleasanton CA 94588 Duration: Long Term Employment: Contractual Rate: $Negotiable Primary Skills: Informatica PowerCenter, Extract Transform Load Onsite ETL lead (Informatica, Oracle experience, SQL) - Informatica Developer to set up ETL on complex systems with Oracle RDBMS as the back-end. Expertise in Informatica PowerCenter and Oracle RDBMS
                      Jun-08-20
                      ($) : USD 73 / Hourly / C2C
                      Job Title: Informatica Lead Location: Boston, MA Duration: Contract Job Description: Build and troubleshoot complicated ETL/ELT data flows Establish and socialize Informatica best practices Interpret business requirements, identify & recommend opportunities, enable and implement where required. Influence strategic direction of new data extraction capabilities. Enable and maintain Informatica SSO authentication (via OKTA, AD, etc.), folder level security and self-service capabilities Identify, mentor and develop high performing information technology professionals. Proof of concept new Informatica modules and apply them to Enterprise use cases Minimum Education & Experience Requirements: Bachelor's or Master's (preferred) degree in Computer Science or Engineering 8+ years of technology experience with progressively increasing responsibilities. 3+ years of working experience managing direct reports or influencing technologists in a matrixed organization Deep and proven expertise of iPaaS platforms such as Informatica (preferred), Talend, Dell Boomi, MuleSoft, Microsoft Data Factory, Oracle, SAP or IBM Previous experience of enterprise tool deployment in a highly regulated environment (FDA, GxP, HIPAA, SOX, GDPR and other data privacy) Knowledge/Skills Needed: Experience designing and troubleshooting complicated data extraction routines and real time / batch integrations Experience extracting and processing data from multiple data sources such as: Applications (SAP, Salesforce, Marketo), Web Services (SOAP/REST), Databases (Oracle, SQL Server, Netezza, Vertica, Snowflake), flat files (.csv, .txt, .xml, .json) and streaming data (instruments, sensors, IoT devices) Experience working in cloud native computing environments (AWS, Azure, GCP) Exposure to modern cloud native database platforms and architectures (Snowflake (preferred), Azure or Redshift) Exposure to version control (GitLab, GitHub, BitBucket) and CI/CD (GitHub, Jenkins, etc.) tools and 
methodologies Experience writing and troubleshooting SQL queries Exposure to Data Warehousing concepts and principles Excellent verbal and written communication skills, including the ability to explain technical concepts and technologies to business leaders
                      Jun-08-20
                      ($) : USD Negotiable
                      DataStage Lead Location: Detroit, MI Contract At least 8 years experience in working hands-on with IBM InfoSphere DataStage (must include at least version 8.5, if not 8.7) 2+ full cycle DataStage implementations, at least one large scale implementation in the role of DataStage module lead Primary work experience (at least 9 years cumulative) in IBM Information Management products like Information Analyzer, DataStage, QualityStage, Metadata Workbench, Business Glossary In-depth experience in the IBM InfoSphere suite of products Hands-on working experience on orchestration using Autosys, Control-M, etc. In-depth experience in handling RT/NRT high-availability ETL design Extensive experience in performance testing for ETL Sound database knowledge of DB2 and/or Oracle, SQL Server, etc. including procedural programming Extensive knowledge of the Information Management landscape, tools, technologies and Information Management frameworks Must have been through several full life cycle Data Warehousing implementations and involved in scalability and performance related design aspects in Database, ETL and Reporting Experienced in working on development, maintenance & production support projects Experienced in creating reusable components, automations and error handling strategies Should be able to do detailed requirements gathering/elicitation, able to conduct requirements gathering workshops with the client. Should be able to articulate and create functional design documents Should have a good understanding of DataStage Architecture Should be able to independently lead offshore engagements Experience with supporting pre-sales, leading solution design, proposal presentations Should have good customer engagement and team leading skills Experience in Cognos BI (Framework Manager, Report Studio, Query Studio), Metric Studio, Analysis Studio and TM1 Very good experience in databases like SQL Server, Oracle and other RDBMS.
Sound knowledge and experience in design and development of reports, scorecards and dashboards Able to design cubes in TM1 and/or Transformer and integrate with Cognos BI Worked in report migration projects like Crystal Reports to Cognos or Business Objects to Cognos Must have good knowledge on Data warehouse, Data modeling and ETL concepts Detail requirements gathering/elicitation and conduct requirements gathering workshops with client Excellent understanding on Cognos Architecture Experience in supporting pre-sales, leading solution design and proposal presentations Independently lead consulting engagements Articulate and create functional design document
                      Jun-08-20
                      Data Architect  San Francisco, CA
                      We are looking for Data Architect in San Francisco, CA Role: Data Architect Location: San Francisco, CA Duration: 12+ months
                      - Minimum: Google Professional Data Engineer, Google Professional Cloud Architect
                      - Preferred: AWS Big Data Specialty Certification
                      - 20+ years direct experience working in Enterprise Data Warehouse technologies
                      - 10+ years in a customer facing role working with enterprise clients
                      - Experience with architecting, implementing and/or maintaining technical solutions in virtualized environments.
                      - Experience in design, architecture and implementation of data warehouses, data pipelines and flows.
                      - Experience with reading software code in one or more languages such as Java, Python and SQL.
                      - Experience designing and deploying large scale distributed data processing systems with a few technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
                      - Customer facing migration experience, including service discovery, assessment, planning, execution, and operations.
                      - Demonstrated excellent communication, presentation, and problem solving skills.
                      - Experience in project governance and enterprise customer management.
                      Jun-08-20
                      We are looking for Data Engineer in San Francisco, CA Role: Data Engineer Location: San Francisco, CA Duration: Long term contract Job description
                      - Certifications Minimum: Google Professional Cloud Architect
                      - Preferred: Google Professional Data Engineer, AWS Big Data Specialty Certification
                      - 7+ years direct experience working in Enterprise Data Warehouse technologies
                      - 3+ years in a customer facing role working with enterprise clients
                      - Experience with implementing and/or maintaining technical solutions in virtualized environments.
                      - Experience in design, architecture and implementation of data warehouses, data pipelines and flows.
                      - Experience with developing software code in one or more languages such as Java, Python and SQL.
                      - Experience designing and deploying large scale distributed data processing systems with one or more technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
                      - Customer facing migration experience, including service discovery, assessment, planning, execution, and operations.
                      - Demonstrated excellent communication, presentation, and problem solving skills.
                      - Willingness to travel around 30%-40%.
                      Jun-08-20
                      Data Scientist  Chicago, IL
                      5+ years of hands on Data Science experience Experience with R or Python Experience with data querying languages, and statistical or mathematical programming language Expertise working with large data sets, data mining, and machine learning tools Excellent communication and analytical skills
                      Jun-08-20
                      Title: Senior Data Scientist Location: Sunnyvale, CA or Austin, TX Duration: Long Term Education: Bachelors or Masters in Computer Science or Engineering with relevant industry experience Required Experience/Skills: - Strong in ML/NLP/Statistics/Engineering Mathematics - Strong understanding of Statistics, Optimization Techniques and Time Series (ARIMA) with academic and industry experience. - Worked as principal contributor and lead in end-to-end phases of building statistical data science and NLP projects - Hands-on experience in Python, Java, Bootstrap, BPM, Oracle SQL, Chart.js. - Experience and understanding of SOA, REST web services and microservices, BPMN 2.0, CI/CD in an Agile environment - Team player, self-motivated, innovator with strong interpersonal skills
                      Jun-08-20
                      Miracle Software Systems is looking for a Sr. Informatica MDM Developer for Owings Mills, MD. Please find the job details below. Position: Sr. Informatica MDM Developer Duration: 12 Months Location: Owings Mills, MD Primary skills: 5 years of experience in Informatica MDM 10.X customizing/configuring Customer360, E360, Informatica Data Director (IDD) Secondary skills: 2 years of experience in SIF APIs 10 years of experience in Informatica PowerCenter 3 years of experience in IDQ (Analyst and Developer) Required: Minimum of 5 years of experience in Informatica MDM, preferably with experience in 10.x Minimum of 2 years of experience installing Informatica MDM environments. Should be proficient in promoting objects to upper environments Should have experience with designing and implementing merge/match/validation rules Expertise in customizing/configuring Customer360, E360, Informatica Data Director (IDD) applications and ActiveVOS approval workflow Minimum of 2 years of experience in SIF APIs Minimum of 10 years of experience in Informatica PowerCenter Minimum of 3 years of experience in IDQ (Analyst and Developer) Minimum 2 years in UNIX shell scripting Experience in working with the Party data model Experience with Address Doctor in IDQ and MDM to cleanse address data Good understanding of data governance/data stewardship Experience working in multi-vendor/multi-cultural environments, preferably with experience leading offshore teams. AWS experience is preferred. Self-starting and flexible team player with good communication skills to develop innovative solutions
                      Jun-08-20
                      MDM Developer  Owings Mills, MD
                      Job Title: MDM Developer Client: T. Rowe Price - MD Location: Richfield, MN Required Skills: PowerCenter, UNIX Job Description: Top Skills Required: MDM experience - installation and development Informatica PowerCenter IDQ UNIX Experience Required: Minimum of 5 years of experience in Informatica MDM, preferably with experience in 10.x Minimum of 2 years of experience installing Informatica MDM environments. Should be proficient in promoting objects to upper environments Should have experience with designing and implementing merge/match/validation rules Minimum of 2 years of experience in SIF APIs Minimum of 10 years of experience in Informatica PowerCenter Minimum of 3 years of experience in IDQ (Analyst and Developer) Minimum 2 years in UNIX shell scripting Experience in working with the Party data model Experience with Address Doctor in IDQ and MDM to cleanse address data Good understanding of data governance/data stewardship Experience working in multi-vendor/multi-cultural environments, preferably with experience leading offshore teams. Self-starting and flexible team player with good communication skills to develop innovative solutions Preferred Experience: AWS experience is preferred. Previous financial industry experience
                      Jun-08-20
                      ($) : USD 65 / Hourly / C2C
                      Job Title: BODS ETL Developer Location: Oakland, CA Duration: Contract Job Description: Ability to analyze, design, and create end-to-end Business Objects Data Integrator (BODS) mapping documents. Ability to develop and test ETL applications and perform ETL troubleshooting using SAP BODS. Build, maintain, and enhance all objects/packages/functions in PL/SQL to support application processes. Perform functions including SQL tuning, database application design, and developer support. Preferred: Knowledge of Java web development preferred. Develop new Java code and execute new features through business requirements. Participate in developing new web applications from beginning to end into a functional system. Must be comfortable working in a Windows and Unix environment. Demonstrated ability in creative, out-of-the-box thinking to solve complex technical problems is desired. 5+ years of related work experience programming with data integration or ETL tools such as Business Objects Data Services (BODS) or Informatica, and 5+ years of related work experience in developing complex SQL queries, stored procedures, functions, and packages in an Oracle or SQL Server database. Strong knowledge of system development principles, procedures and formal SDLC methodologies required. Strong knowledge of programming concepts and techniques required to design and develop ETL interfaces, ETL workflows and dataflows to transfer data from various complex formats. Detail-oriented with problem-solving capabilities Knowledge and application of English grammar, composition, editing and proofreading skills Strong organizational/time management and project management skills and multi-tasking abilities
                      Jun-08-20
                      Hi, There is an urgent requirement for the following. Please let me know your interest ASAP. Position: DataStage Developer Location: Parsippany, NJ Duration: Long term Looking for a solid DataStage resource with very strong DataStage expertise · Minimum 6-8 years of hands-on DataStage experience · Should have worked with JSON format data · Must have solid shell scripting hands-on experience · Must know how to tune DataStage jobs · Create an audit framework for DataStage jobs · Must have data migration experience · Working experience in handling different types of files like mainframe/delimited etc. Swati Singh Technical Recruiter Xchange Software Inc 10 Austin Avenue, Iselin, NJ - 08830. Direct Fax www.xchangesoft.com
                      Jun-08-20
                      ($) : USD 60 / Hourly / C2C
                      Job Title: ETL QA Lead Location: Phoenix, AZ Duration: Long Term Contract Required Skills: Cloud or on-prem ETL based frameworks. Unix/Python scripting, SQL, ETL tools - Informatica/DataStage Job Description Test planning, test data creation, test execution for new ETL development or any data migration efforts Experience in testing methods - Unit, Integration, Systems, End-to-End, UAT, Performance etc. Test automation framework - Cloud or on-prem ETL based frameworks Understand data pipelines and modern ways of automating data pipelines using cloud-based and on-premise technologies Ability to develop scripts (Unix, Python, etc.) to Extract, Load and Transform data Experience developing SQL scripts and stored procedures Understanding of various data formats such as CSV, XML, JSON, PARQUET etc. ETL tools - Informatica/DataStage Experience with the Snowflake cloud-based data warehouse Integrate on-premise infrastructure with public cloud (AWS) infrastructure Translate requirements for BI and Reporting to database design and reporting design
                      Jun-08-20

                      Understanding Data Warehouse & ETL

                      A data warehouse is a large database designed solely for data analysis; it is not used for routine database processing or business transactions. ETL (Extract, Transform, and Load) is the process by which data is extracted from various sources, transformed to remove redundancy and inconsistency, and loaded into a data warehouse or repository, where it is made available for querying and analysis. ETL supports effective decision making and analytics based on the consolidated data. Slices of data from the data warehouse can be stored in a data mart, which enables quick access to specific data such as sales summaries or finance reports. Search for exciting IT job openings in New Jersey.
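The extract-transform-load flow described above can be sketched in a few lines of Python. This is a minimal illustration, not production code: the source rows, table name, and column names are all hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

# Hypothetical source records, standing in for rows pulled from files or APIs.
SOURCE_ROWS = [
    {"region": "NE", "sales": "1200.50"},
    {"region": "NE", "sales": "1200.50"},  # duplicate that transform removes
    {"region": "SW", "sales": "980.00"},
]

def extract():
    """Extract: pull raw records from the source system."""
    return SOURCE_ROWS

def transform(rows):
    """Transform: remove duplicates and cast text amounts to numbers."""
    seen, clean = set(), []
    for row in rows:
        key = (row["region"], row["sales"])
        if key not in seen:
            seen.add(key)
            clean.append((row["region"], float(row["sales"])))
    return clean

def load(rows):
    """Load: write the cleansed rows into a warehouse fact table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales_fact (region TEXT, sales REAL)")
    con.executemany("INSERT INTO sales_fact VALUES (?, ?)", rows)
    return con

con = load(transform(extract()))
total = con.execute("SELECT SUM(sales) FROM sales_fact").fetchone()[0]
print(total)  # 2180.5
```

Once loaded, the same fact table could feed a data mart, e.g. a view holding only the sales summary.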

                      Data Warehouse Features & Capabilities

                      A data warehouse has features and capabilities that support data analysis with ease. A good data warehouse should be able to:
                      - Interact with other sources and inputs, extracting data using data management tools.
                      - Extract data from a variety of sources: files, Excel, applications, and so on.
                      - Allow cleansing so that duplication and inconsistency can be removed.
                      - Reconcile data to standard naming conventions.
                      - Allow both native and autonomous storage of data for an optimized process.
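The reconciliation capability above (standard naming conventions) can be illustrated with a short Python sketch. The field names and rename map here are hypothetical; a real warehouse would typically drive this from a metadata catalog.

```python
# Hypothetical map from source-system field names to warehouse-standard names.
RENAME_MAP = {
    "CustomerName": "customer_name",
    "cust_nm": "customer_name",
    "OrderDate": "order_date",
    "ord_dt": "order_date",
}

def reconcile(record):
    """Rename each field to its standard warehouse name (lowercase fallback)."""
    return {RENAME_MAP.get(k, k.lower()): v for k, v in record.items()}

crm_row = {"CustomerName": "Acme", "OrderDate": "2020-06-08"}  # CRM convention
erp_row = {"cust_nm": "Acme", "ord_dt": "2020-06-08"}          # ERP convention

# After reconciliation, rows from both systems share one schema.
print(reconcile(crm_row) == reconcile(erp_row))  # True
```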

                      Top ETL Tools to excel in Data Warehousing Jobs

                      There are many ETL tools available in the market. The most commonly used tools for ETL are:
                      - Sybase
                      - Oracle Warehouse Builder
                      - CloverETL
                      - MarkLogic
                      There are also excellent data warehousing platforms such as Teradata, Oracle, Amazon Web Services, Cloudera, and MarkLogic. Expertise in any of these can fetch you a good job in the field of data warehousing.

                      Salary Snapshot for Data warehousing Jobs in US

                      A senior data warehouse developer receives an average pay of $123,694 a year. Depending on skill and expertise, salaries in this field can range anywhere from $83,000 to $193,000. Most senior data warehouse developers receive a salary between $103,500 and $138,000 in the United States. There are currently plenty of data warehouse developer jobs in the USA.

                      Career Path for a Data Warehouse Professional

                      Data warehousing offers immense opportunities for an IT professional. There are a plethora of roles and designations required to manage this vast application and its different modules. Data warehouse managers are software engineers who build storage mechanisms for organizations to meet the needs of the company. Entry-level roles in data warehousing are Software Developer, Software Engineer, Business Intelligence (BI) Developer, and Data Warehouse ETL Developer. People who use the data in the warehouse to arrive at decisions are Data Analysts, Data Scientists, and Business Intelligence (BI) Analysts. Senior roles in this field include Data Warehouse Manager, Senior Financial Analyst, Senior Software Engineer/Developer/Programmer, and Senior Business Analyst. Data warehousing jobs in the USA are still prevalent, and if you are a specialist in this field, you can make a great career out of it.
                      Data Warehouse Skills & Tools
                      To be a data warehousing professional, you need an in-depth understanding of database management systems and their functions. Experience in developing databases using any of the database applications is an added advantage. Apart from this, the other technical skills required for a data warehousing job are discussed below:
                      - Tools for developing ETL. You can either develop ETL mappings quickly with a tool or build them from scratch. Some commonly used ETL tools are Informatica, Talend, and Pentaho.
                      - Structured Query Language (SQL) is the backbone of ETL. You must know SQL, as it is the technology used to build ETL processes.
                      - Parameterization is another crucial skill to master.
                      - Knowledge of any of the scripting languages used around a database application, such as Python, Perl, and Bash, will come in handy.
                      - Debugging is another essential technical skill, as nothing ever goes as planned.
                      Apply for top tech jobs from other US states and cities.
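Two of the skills above, SQL and parameterization, combine naturally: a parameterized query lets one statement serve many inputs and keeps user values out of the SQL text. A minimal sketch using Python's built-in sqlite3 module (the table and column names are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("NE", 100.0), ("SW", 250.0), ("NE", 50.0)],
)

def region_total(region):
    """Sum order amounts for one region via a parameterized query.

    The driver binds `region` to the ? placeholder, so the same statement
    is reusable for any input and immune to SQL injection.
    """
    row = con.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE region = ?",
        (region,),
    ).fetchone()
    return row[0]

print(region_total("NE"))  # 150.0
print(region_total("XX"))  # 0 (no matching rows)
```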
                      ($) : USD 130 / Yearly
                      VDart We are a Global Information Technology Services & Workforce Solutions firm headquartered out of Atlanta, GA with presence in US, Canada, MX, UK, Belgium, Japan & India. Founded in 2007, our team of 2550+ professionals continually creates impact for our customers worldwide in solving complex technology challenges with cutting edge technologies. We specialize in providing the Fortune 1000 companies niche, hard-to-find skills in technologies including Social, Mobile, Big Data Analytics, Data Sciences, Cyber Security, IoT, Cloud, Machine Learning, and Artificial Intelligence. With delivery centers in the UK, Mexico, Canada, and India, we provide global workforce solutions to our customers covering EMEA, APAC & Americas. VDart is an award-winning organization recognized by the Inc 5000 Hall of Fame; Atlanta Business Chronicle's Fastest Growing Companies; NMSDC's National Supplier of the Year; Ernst & Young's Regional Entrepreneur of the Year and more. Title: Data Architect Location: San Ramon, CA Duration: Full time Number of positions: 1 Key Skills: Cloud Experience, Data Modeling, Data Security Responsibilities Strategize, design & own the implementations to support new data solutions that become the foundation of the customer transaction and business data platform. Collaborate with the Data Architect and end consumers across the organization to understand and prioritize solutions. Set standards and processes to drive data quality across our transaction and business data access layer. Create and communicate a roadmap for the next generation transaction and business data platform. Identify innovative solutions and emerging technologies that improve Chevron's speed to insight. Maintain a methodical approach for data modeling, with an attention to detail that makes using Chevron's data a joy for all. Requirements 12+ years total of demonstrated success in data architecture in large, complex, fast-changing business environments.
Experience with streaming data solutions, customer data platforms, and data virtualization technologies. Experience with database technologies, understanding their strengths, weaknesses and appropriate use cases Ability to easily partner with cross-functional teams to gather, understand and distill requirements. Ability to create clear and detailed technical diagrams and documentation. Drive architecture, design and modeling sessions, with ability to communicate, own the outcome and achieve shared vision. Experience implementing data governance and data security processes. Experience to engage "hands-on" with data design and modeling. Understanding of Machine Learning use cases preferred. Skills: "Data Architect" "Data engineer" "Data analyst" "Bigdata Architect" "ETL Architect" "Data Modeler" Referral Program: Ask our recruiting team about how you can be a part of our referral program. If you refer a candidate with the desired qualifications and your candidate accepts the role, you can earn a generous referral fee. We want to hire the best talent available and are committed to building great teams and partnerships. We are Equal Employment Opportunity Employer. 
VDart Inc Alpharetta, GA Follow us on Twitter for the hottest positions: @VDart_Job Follow us on Twitter: @vdartinc
                      Jun-08-20
                      Hi, This is Sarath from Avtech solution Inc, Hope you are doing great! Please review the following job description and let me know if you are available and willing to apply Role : Informatica Developer Location: Malvern, PA Duration: 12 months Skills: DX/MFT skills using the Informatica tool is desired 6+ years of software development experience is required. Experience with the development of B2B integration solutions is required. Experience with EAI, ESB is a plus Experience with Informatica B2B DX DT, Informatica PowerCenter, ETL Development is required. Proficiencies in various endpoint types supported by DX are required. Must be able to create data interchange endpoints using JMS message queues, directories, and managed file transfers. Must be able to create and manage partners and accounts, profiles, applications and workflows. Must be able to create PowerCenter mappings and workflows to process documents, incorporating B2B DX and B2B DT transformations when required. Proficiencies in HL7, EDI libraries as well as JSON and XML messaging are preferred. Experience in Linux and UNIX is required. Experience in integrating with Oracle and SQL Server databases using JDBC is required. Experience in applying software development methodologies (SDLC, Agile) is preferred. Thanks & Regards, Sarath | IT Recruiter AVTECH Solutions Inc. Phone-Ext(509) Email: Web: www.avtechsol.com
                      Jun-07-20
                      Greetings from Avtech Solutions!! We have an opening for an "Informatica Developer" requirement for one of our client engagements; below is the job description. If you are interested, kindly call me or mail me ASAP. Position: Informatica Developer Location: Malvern, PA Duration: 12 months Visa: USC, GC, H1B, TN Skills: DX/MFT skills using the Informatica tool is desired 6+ years of software development experience is required. Experience with the development of B2B integration solutions is required. Experience with EAI, ESB is a plus Experience with Informatica B2B DX DT, Informatica PowerCenter, ETL Development is required. Proficiencies in various endpoint types supported by DX are required. Create data interchange endpoints using JMS message queues, directories, and managed file transfers. Create and manage partners and accounts, profiles, applications and workflows. Create PowerCenter mappings and workflows to process documents, incorporating B2B DX and B2B DT transformations when required. Proficiencies in HL7, EDI libraries as well as JSON and XML messaging are preferred. Experience in Linux and UNIX is required. Experience in integrating with Oracle and SQL Server databases using JDBC is required. Experience in applying software development methodologies (SDLC, Agile) is preferred. Thanks & Regards, Raj - Rajesh Kumar AVTECH Solutions Inc Ext -504 (Voice)
                      Jun-06-20
                      ($) : $ANNUAL
                      Job Title: Data Engineer with Netezza Exp Location: Mountlake Terrace, WA The Data Engineer II creates, modifies, and tests the code, forms, and scripts that construct the data sets that drive our Data Science and Business Intelligence capabilities. This work is derived from specifications developed in collaboration with those consuming or interacting with those data sets. The Data Engineer may develop and write solutions to store, locate, and retrieve specific documents, data, and information and distribute them through multiple methods. They will perform ETL (extract, transform, load) processes. As they continue to build knowledge and expertise in this area, they will follow professional standards and best practices as well as departmental guidelines to independently solve problems of moderate scope and complexity. What they will do: Solve business and data science problems using data-centric programming and scripting skills to create data models and pipelines. Work closely with the business to create understanding of the needs, pace and direction for our business partners, translate these needs into requirements and specifications, and maintain contact with the customers throughout project completion. Troubleshoot issues as they arise and solve problems independently and collaboratively. Collaborate with data scientists and other analysts to further understand business problems. Develop code to complete effective solutions using applicable technology. Perform thorough peer design and code reviews. Use developing data ETL experience to develop data pipelines to support data product automation. Other duties as assigned. What they will have: The typical incumbent will have a bachelor's degree in computer science, computer engineering, or a similar area and two to five (2-5) years of experience in data integration, design and management.
Additionally, they will have knowledge of the software development lifecycle, relational database theory, and the skills to utilize one or more programming languages. At a minimum, candidates must possess the equivalent in education, experience, and skills of a bachelor's degree in a related field and two (2) years of relevant experience. Familiarity with healthcare-specific regulatory requirements for data management. (Preferred) Experience providing data integration services within healthcare organizations. (Preferred) Knowledge of Tableau, SAS, R, and other analytic tools. (Preferred) What they bring: Good problem definition, analytical, problem-solving, and technical writing skills. Strong data processing programming skills across SQL-based and Hadoop-based technologies. Good written and verbal communication skills and the ability to deal with management at various levels and communicate information and ideas in writing so others will understand. Knowledge of Agile and Scrum project methodologies utilizing TFS, Jira/Confluence. Ability to use Extract Transform Load (ETL) tools (SSIS, DataStage, Cask). Ability to use the Kimball methodology for dimensional data modeling, 3rd normal form DW.
                      Jun-06-20
                      ($) : Market
                      Senior Business Data Analyst Charlotte NC 12-24 months Role starts off remote then moves onsite when COVID ban is lifted Candidates must have · 7+ years of data analysis, profiling and exploratory skills · 5+ years in data warehousing background (understanding concepts and knowing about dimensional vs relational databases) · Strong relational/dimensional understanding of data structures · Must have strong SQL skills (7 or 8/10) – will be working on Teradata platform (Teradata not required) · Experience writing BRD/FSD · Will be doing Data remediation, data profiling, data cleansing · Microstrategy reporting is a plus but not required
                      Jun-05-20
                      ($) : Market
                      Job Description: MUST HAVE skills: Development using Informatica PowerCenter Experience using Oracle PL/SQL and Microsoft SQL queries and syntax Position Summary: · Under minimal supervision, designs, codes or configures, tests, debugs, deploys, documents and maintains programs using a variety of software development toolkits, programming languages, testing/verification applications and other tools, while adhering to specific development best practices and quality standards. · Specifically, experience developing with either Informatica PowerCenter or Infor (Lawson) Process Automation (IPA). · Gathers business requirements, translating that information into detailed technical specifications from which programs will be written or configured, and validating that the proposed applications align with both the architectural design and the business needs. · Participates in process leadership for work groups, and product/service delivery strategy and work plans. Other responsibilities may include deep troubleshooting and issue analysis, as well as coding, testing and implementing software enhancements and/or applying patches. · Competent to work on most phases of applications systems analysis and programming activities, but requires instruction and guidance in others. · Provides coaching to less experienced Application Development Analysts. Qualifications: Basic Qualifications: Education - Associate's Degree or equivalent in related field Experience - 2 years of experience typically gained through skills/knowledge/abilities in developing and supporting applications, and/or system development lifecycle including coding, testing, and implementation Preferred Qualifications: Education - Bachelor's Degree in related field Development using either Informatica PowerCenter or Infor (Lawson) Process Automation (IPA) Experience using Oracle PL/SQL and Microsoft SQL queries and syntax
                      Jun-05-20
                      ($) : Market
                      Data Analyst III/Data Scientist Minneapolis, MN/Charlotte, NC 6 Months Contract Must Have Experience in Banking Domain Job Description: · Gather business requirements, researching and building a detailed understanding of the problem and related data assets (data discovery) in order to code the data processing and analysis. · Ensure that the data used follows the compliance, access management and control policies of the company while meeting the data governance requirements of the business. · Work with technical groups to support the collection, integration and retention of the data sources. · Apply data visualization and summarization techniques to the analytical results. · Interpret and communicate the results in a manner that is understood by the business.
                      Jun-05-20
                      Immediate need for a DataStage Admin. This is a 24 months contract opportunity with long-term potential located in Mahwah NJ. Please review the job description below: Job ID: 20-18359 Key Responsibilities: Candidate will support multiple IBM DataStage environments. Responsibilities include product upgrades, patching, security compliance, health checks, new installations and customer support. Key Requirements and Technology Experience: DataStage version upgrades 11.5 or better DataStage installation and patching versions 11.5 or better DataStage server install base of 50 or more on RHEL 7 DataStage installation and support of tiers on multiple servers DataStage Operations console and database queries DataStage Client installations/upgrades and troubleshooting DataStage Work Load Manager configurations DataStage System/Project configurations Experienced in Jenkins, Ansible, Git, Bitbucket, Service Now Security configuration and remediation (security scans / pen test remediations / admin groups / SSH / TLS 1.2 etc.) Linux systems scripting Health checks System performance Work with IBM on PMRs / issues Troubleshooting / root cause analysis Our client is a leading Supply Chain Industry company and we are currently interviewing to fill this and other similar contract positions. Qualified candidates should apply online for immediate consideration.
                      Jun-05-20
                      Greetings from Avtech Solutions Inc, Role: Data Scientist Hire Type: 6+ Months project Location: Great Neck, NY (REMOTE TILL COVID LOCKDOWN) Experience: Mid-Level, 7-9 years Visa: GC, USC Primary Job Responsibilities The data scientist will leverage existing data technologies to aggregate, transform, and perform meaningful feature engineering that includes structured transactional data and unstructured natural language data. You will perform feature engineering and statistical analysis across heterogeneous sources of textual data, and build algorithmic solutions. As a member of the team, you will perform data analysis, ensure data quality, and develop tracking and reporting systems to determine the effectiveness of models. Design and create systems to structure, aggregate, and turn petabytes of messy information into statistically significant features for modeling purposes. Required Skills and Experience: Degree in Computer Science or a quantitative field Entry- to mid-level role with at least 8-10 projects related to data cleaning, transformation and feature engineering Experience in SQL, relational databases, Hadoop framework Expertise in machine learning packages in Python; knowledge of Paxata and DataRobot a plus At least 2-3 projects with unstructured data and applied knowledge of Natural Language Processing (NLP). Regards, Rahul S | AVTECH Solutions Inc. Phone EXT 502 Email: LinkedIn: https://www.linkedin.com/in/srahuls/
                      Jun-05-20
                      ($) : USD Negotiable
                      Informatica MDM Tester Remote Contract Key Responsibilities include: Drive solution architecture decisions & collaborate with the client in designing the MDM architecture roadmap. Create and maintain optimal data model and solution architecture for our clients. Provide guidance to perform data quality analysis on large and complex data sets. Provide guidance and leadership to the technical team throughout the implementation. Identify areas for data quality improvements and help to resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies. Provide assistance in resolving data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies, collaborating with subject matter experts (SMEs) and data stewards. Perform technology stack evaluation and suggest the optimal stack for the client. Experience in development of data pipelines to integrate data from source to MDM and vice versa Work with different stakeholders including the executive, product, data and design teams to assist with data-related technical issues and support their data infrastructure needs Should be able to develop complex cross-domain hierarchies Design business process workflows to support data governance and data stewardship Actively manage risks for data and ensure there is a data recovery plan Required Technical Skills: Experience - 5 to 10 years Hands-on experience in client projects Develop, enhance and support large-scale MDM project implementation Monitor, analyze and participate in business requirement review, data loading, data modelling etc. Directly interact with the client to lead and handle all design and development activities Experience with Java enterprise Good to have experience in AWS services like S3, SQS etc. Good to have experience in integration platforms like SnapLogic, MuleSoft etc. 
Willing to learn new MDM tools such as Semarchy, Profisee etc.
                      Jun-05-20
                      ($) : Market
                      VDart Title: Informatica MDM Functional Tester Location: 100% Remote Duration: 3 to 6 Months Contract Rate: $Negotiable Project Details: It's an offshore-centric project, and we are hiring for some key roles here in the US. Working Hours: Flexible to work on India time zone for at least 5 hrs. Key Responsibilities include: Drive solution architecture decisions & collaborate with the client in designing the MDM architecture roadmap. Create and maintain optimal data model and solution architecture for our clients. Provide guidance to perform data quality analysis on large and complex data sets. Provide guidance and leadership to the technical team throughout the implementation. Identify areas for data quality improvements and help to resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies. 
Provide assistance in resolving data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies, collaborating with subject matter experts (SMEs) and data stewards. Perform technology stack evaluation and suggest the optimal stack for the client. Experience in development of data pipelines to integrate data from source to MDM and vice versa Work with different stakeholders including the executive, product, data and design teams to assist with data-related technical issues and support their data infrastructure needs Should be able to develop complex cross-domain hierarchies Design business process workflows to support data governance and data stewardship Actively manage risks for data and ensure there is a data recovery plan Required Technical Skills: Experience - 5 to 10 years Hands-on experience in client projects Develop, enhance and support large-scale MDM project implementation Monitor, analyze and participate in business requirement review, data loading, data modelling etc. Directly interact with the client to lead and handle all design and development activities Experience with Java enterprise Good to have experience in AWS services like S3, SQS etc. Good to have experience in integration platforms like SnapLogic, MuleSoft etc. Willing to learn new MDM tools such as Semarchy, Profisee etc. Referral Program: Ask our recruiting team about how you can be a part of our referral program. If you refer a candidate with this background and the candidate accepts the role, our team pays a generous referral. We are keen on networking and establishing a long-term, mutually beneficial partnership with you. We are Equal Employment Opportunity Employer. VDart Inc Alpharetta, GA Follow us on Twitter for the hottest positions: @VDart_Jobs Follow us on Twitter: @vdartinc
                      Jun-05-20
                      ($) : USD 90 / Hourly / C2C
                      Title: AWS Architect Location: Owings Mills, MD Duration: Contract / Fulltime Client: Capgemini Sogeti Job Description: The AWS Architect should have experience with cloud architecture, data pipelines, and data engineering. Strong AWS data architect. They will work across the DMS ecosystem (RPS, II, Shared Services, etc) not just on one project. Hands-on AWS cloud architect! Current ecosystem: Java, databases, Aurora, WebSphere, Spark, Python, Step Functions, scripting, orchestration, Airflow, some data engineering along the CI/CD pipeline. They will need good leadership skills, the ability to drive initiatives forward, and a customer focus.
                      Jun-05-20
                      Data Architect  Raleigh, NC
                      ($) : DOE
                      Data Architect requirement (with experience in database architecture, data mapping, migration, ETL, etc.) As a Data Architect you would work with enterprise architects, business analysts/SMEs, and technical architects to ensure that our solutions and technologies align with our customer's expectations. Your key focus areas would include: Data ingestion & data migration Data quality Data governance Master data management In general, the Data Architect role would require that you: Work with the field force to prepare solutions, estimates, project plans and qualify requirements. Drive technology aspects of solution architecture design and implementation activities across projects, teams and verticals. Create data models, functional & technical specifications and data flows Enhance our in-house solutions & offerings Take on multiple technologies The ideal candidate would have: 10 years of experience in enterprise data management, including: o 3-5 years' experience in ETL tools, preferably Informatica o 2+ years' experience in at least 1 data quality tool, preferably Informatica o 3-5 years' experience in data governance or master data management projects, preferably on Informatica o Knowledge of data modelling - 3 years o Experience in preparing requirement, functional and technical documents o Experience in preparing proposals, solutions, presentations, project plans and estimations
                      Jun-04-20
                      Data Scientist (5 positions) Location: Kenilworth, NJ Duration: 6-18 months contract to hire Description The Senior Predictive Modeler / Senior Data Scientist will apply machine learning and statistical modelling methodologies to build predictive models on product market dynamics, population, revenue, product launches and other market events. Your responsibilities will include: · Ensuring quality delivery according to the standards and best practices methodology · Coordinating the delivery of the team and managing the backlog and risks during the modelling · Driving the development of the SW code (quality, documented, and reusable) to execute the predictive models · People management and talent development to enable people to act as a data science practitioner (see e.g. points below) · Communication & collaboration with different stakeholders · Understanding of the business context of the decisions to be supported · Suggesting the appropriate mathematical approach · Building mathematical or statistical models using the available data · Use of R to design and develop analytical models for advanced techniques such as forecasts, simulations, and optimizations · Assessment of the output of the analytical solution and use of the data to draw conclusions, identification of options, and making recommendations · Interpret the results and provide advice to the business stakeholders · Building and maintaining of knowledge on the data sources, the data quality and metadata · Recognition of the repeatable situation and suggesting the appropriate level of automation of analytics · Promotion of a culture of modelling and analytics as a differentiator and competitive advantage – an environment that places high value on embedding analytical tools within business processes, and using information to make fact-based decisions. 
· Stay in touch with modern methodology within predictive modelling Qualifications Qualifications and Education Minimum Requirement · MSc Degree in Mathematics, Statistics, Economics, or other related field. Desired Experience and Skills (not compulsory) · Passion for advanced analytics (mathematics) and applying it to predictive modelling · The ability to formulate models, analyse them, and program and run simulations in some appropriate system (for example R) · Extensive knowledge in one or more of the following areas: Mathematical analysis and modelling, probability, stochastic processes, econometrics, time series · Demonstrated communication and interpersonal skills, including the ability to make high-level presentations for senior executives · Strong analytical and problem-solving skills and ability to work with incomplete or imperfect data, including ability to interpret and use the analytics results within the real business context · Ability to work both independently and collaboratively within a globally dispersed team · Proven track records with creation of the analytical SW products of large scope where the collaboration tools and version control systems were leveraged
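The forecasting-and-simulation work described above is often done with Monte Carlo methods. The posting calls for R; purely as a language-neutral sketch (all numbers here are invented), the same idea in Python: draw random monthly growth rates, compound them over a horizon, and read percentiles off the simulated distribution.

```python
import random
import statistics

def simulate_revenue(base, mean_growth, sd_growth, months, runs=5000, seed=1):
    """Monte Carlo revenue forecast: compound monthly growth drawn from a normal distribution."""
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    outcomes = []
    for _ in range(runs):
        revenue = base
        for _ in range(months):
            revenue *= 1 + rng.gauss(mean_growth, sd_growth)
        outcomes.append(revenue)
    outcomes.sort()
    return {
        "p10": outcomes[int(0.10 * runs)],
        "median": statistics.median(outcomes),
        "p90": outcomes[int(0.90 * runs)],
    }

forecast = simulate_revenue(base=100.0, mean_growth=0.02, sd_growth=0.05, months=12)
print(forecast)
```

Reporting a p10/p90 band rather than a single point estimate is what lets the business see the risk around the forecast, which is the usual reason to simulate instead of just compounding the mean.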
                      Jun-04-20
                      Data Scientist  Redmond, WA
Additional Notes: Need at least 4 to 5 years purely in Data Science. Work Authorization: Open.
Job Description:
- Advanced understanding of probability, statistics, machine learning, and data science
- Expertise in data correlation/feature analysis, analysis of machine learning models, and optimizing models for accuracy
- Proficiency in transforming and cleaning data and working across multiple models
- Ability to research and manipulate complex and large data sets (both distributed and non-distributed)
- Strong fundamentals in problem solving, algorithm design, and model building
- Ability to solve complex business problems through a blend of logical and creative thinking
- Strong communication skills with an excellent ability to synthesize complex information
- Code-writing capability in Python and familiarity with relevant ML packages; familiarity with libraries such as Pandas, Scikit-learn, etc.
- Extensive working experience in NLP, NLU, topic mining, and sentiment analysis is a must; should have worked with NLP packages such as NLTK, Gensim, and TextBlob
- Deep learning experience is an added advantage
Experience: 7+ years. Education: Master's degree or higher in Computer Science, Data Science, Engineering, Math, Applied Statistics, or a related field.
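For context on the sentiment-analysis requirement: at its simplest, lexicon-based sentiment scoring counts positive and negative words. The toy below uses only the standard library and a tiny invented lexicon; real work would use the NLTK/TextBlob/Gensim packages the posting names, which add tokenization, negation handling, and trained models.

```python
# Tiny invented lexicon; production systems use much larger, weighted ones.
POSITIVE = {"great", "good", "excellent", "love", "fast"}
NEGATIVE = {"bad", "poor", "slow", "hate", "broken"}

def sentiment(text):
    """Classify text as positive/negative/neutral by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("great fast service"))   # positive
print(sentiment("slow and broken"))      # negative
```

This ignores negation ("not good" scores positive), which is exactly the kind of gap the NLP packages in the requirements exist to close.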
                      Jun-03-20
                      ($) : Market
VDart: We are a global Information Technology Services & Workforce Solutions firm headquartered in Atlanta, GA with a presence in the US, Canada, Mexico, UK, Belgium, Japan & India. Founded in 2007, our team of over 2550 professionals continually creates impact for our customers worldwide, solving complex technology challenges with cutting-edge technologies. We specialize in providing Fortune 1000 companies with niche, hard-to-find skills in technologies including Social, Mobile, Big Data Analytics, Data Sciences, Cyber Security, IoT, Cloud, Machine Learning, and Artificial Intelligence. With delivery centers in the UK, Mexico, Canada, and India, we provide global workforce solutions to our customers covering EMEA, APAC & the Americas. VDart is an award-winning organization recognized by the Inc 5000 Hall of Fame; Atlanta Business Chronicle's Fastest Growing Companies; NMSDC's National Supplier of the Year; Ernst & Young's Regional Entrepreneur of the Year; and more.
Role: Data Analyst. Location: Phoenix, AZ. Type of Hire: Contract. Rate: $Negotiable.
Job description (DB stack: SQL, MongoDB, Oracle):
- Enforce, refine, and document database standards
- Work with technical/development leads to develop and review database designs
- Assess and identify database server resource requirements
- Develop and maintain data models
- Demonstrate advanced understanding of multiple database platforms
- Code, test, and deploy database changes
- Lead efforts to develop and improve procedures for automated monitoring and proactive intervention to prevent customer impact
- Review performance stats and recommend changes for tuning database and application queries
- Assist developers in the development and tuning of database queries, stored procedures, indexes, etc.
- Participate in periodic backup audits, including test restores, ensuring all databases and associated logs are being backed up
- Lead requirements assessments for backup and recovery
- Assume ownership of database-related problems; pursue problem resolution and root cause analysis
Referral Program: Ask our recruiting team about how you can be a part of our referral program. If you refer a candidate with this background and the candidate accepts the role, our team pays a generous referral. We are keen on networking and establishing a long-term, mutually beneficial partnership with you. We are an Equal Employment Opportunity Employer. VDart Inc, Alpharetta, GA. Follow us on Twitter for the hottest positions: @VDart_Jobs. Follow us on Twitter: @vdartinc
                      Jun-02-20
Data Scientist for State of Georgia. Location: Atlanta, GA. Duration: 12 Months. The OPHI Data Scientist will lead the OPHI analytics team and bring considerable technical skill and knowledge of data analytic techniques to bear on a broad range of information problems facing the department.
Primary Duties:
- Research and develop statistical learning models for data analysis
- Collaborate across DPH departments to understand organizational needs and devise possible solutions
- Create complex SQL queries to generate reports according to research study requirements
- Keep up to date with the latest technology trends
- Communicate results and ideas to key decision makers
- Implement new statistical or other mathematical methodologies as needed for specific models or analyses
- Optimize joint development efforts through appropriate database use and project design
- Analyze, summarize, and/or review data, including reporting findings, interpreting results, and making recommendations
- Establish methodologies and protocols for collecting, retrieving, analyzing, and maintaining programmatic data across program areas
Required Skills: The current need is stronger on the data side than the technical side (though that skillset doesn't hurt at all), with expert-level use of analytic methodology as well as various analytic applications (e.g. Qlik, Tableau, R, Power BI, BigQuery). Experience with R, Python, Oracle, ETL, data processing, database programming, and data analytics.
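To illustrate the reporting-query duty above in miniature: a typical report aggregates case-level rows into per-group totals. The sketch uses Python's built-in sqlite3 module with an invented public-health table and counts; real DPH work would run comparable (and far more complex) SQL against Oracle or BigQuery.

```python
import sqlite3

# In-memory database with a hypothetical case-count table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cases (county TEXT, disease TEXT, n INTEGER);
INSERT INTO cases VALUES
  ('Fulton', 'flu', 120), ('Fulton', 'measles', 3),
  ('DeKalb', 'flu', 85),  ('DeKalb', 'measles', 1);
""")

# Report: total cases per county, largest first.
report = conn.execute("""
    SELECT county, SUM(n) AS total
    FROM cases
    GROUP BY county
    ORDER BY total DESC
""").fetchall()
print(report)  # [('Fulton', 123), ('DeKalb', 86)]
```

Study-specific reports layer filters (date ranges, disease codes) and joins onto the same GROUP BY skeleton.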
                      Jun-01-20
                      ($) : Market
Hi, hope you are doing well! Have a look at the JD below and let me know your view ASAP. Role: AWS Data Architect. Location: Charlotte, NC. Duration: 12+ months, W2 position. We are seeking an Architect-level consultant (this is not an Engineer-level role); this person must be experienced in building out large data lake solutions end to end. 5+ years in pure data solutions consulting; must have a recent focus on AWS Cloud. Requires solid experience with native AWS technologies for data and analytics such as Redshift Spectrum, Athena, S3, Lambda, Glue, EMR, Kinesis, SNS, CloudWatch, etc. Experience architecting/building real-time data ingestion/delivery streams. Experience with NoSQL databases like DynamoDB/MongoDB. Experience transitioning on-premise big data platforms into cloud-based platforms. Stay free from Covid-19: "Prevention is better than cure." Thanks & Regards, Kumar Sri Bharath, Smart Folks Inc. Email: Contact
                      Jun-01-20
Hi, we have an opening for the below position. Please go through the JD & share suitable profiles. Contract Position 1: Role: Cloud Engineer. Location: Denver, CO (Locals Only). Visa: GC/USC/H1 only. ONLY CANDIDATES LOCAL TO DENVER. Thanks, Priyanka | IT Recruiter, AVTECH Solutions Inc.
                      Jun-01-20
                      Data Scientist  San Bruno, CA
Job Title: Data Scientist. Client Company: Walmart.com. City: San Bruno. State: California. Only locals please.
Job Duties: Design and build new data set processes for modeling, data mining, and production purposes. Determine new ways to improve data and search quality and predictive capabilities. Perform and interpret data studies and product experiments concerning new data sources or new uses for existing data sources. Develop prototypes, proofs of concept, algorithms, predictive models, and custom analyses.
Minimum Qualifications:
- PhD in Computer Science, Statistics, or a related field; OR a Master's degree or equivalent in Computer Science, Statistics, or a related field and 2 years of related experience
- Knowledge of machine learning, information retrieval, data mining, statistics, NLP, or a related field
- Programming skills in one of the following languages: Java, Scala, C/C++
- Knowledge of one of the scripting languages such as Python or Perl
- Experience analyzing and interpreting the results of product experiments; knowledge of statistical languages such as R
- Experience working with large data sets and distributed computing tools (Map/Reduce, Hadoop, Hive, or Spark)
- Working knowledge of relational database systems and SQL
- Experience managing an end-to-end machine learning pipeline from data exploration, feature engineering, model building, and performance evaluation to online testing with big data sets
- Excellent communication and organizational skills
- Prior experience in this area with eCommerce or online retail would be a plus
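The "end-to-end machine learning pipeline" requirement above, shrunk to its skeleton: generate data, split into train/test, fit a model, evaluate held-out accuracy. The sketch below uses only the standard library and a deliberately simple nearest-centroid classifier on synthetic 2-D data; all names and numbers are invented, and real pipelines swap in proper libraries and real features at each stage.

```python
import random

def make_data(n_per_class=40, seed=7):
    # Synthetic 2-D points: class "A" near (0, 0), class "B" near (5, 5).
    rng = random.Random(seed)
    rows = [(rng.gauss(0, 0.5), rng.gauss(0, 0.5), "A") for _ in range(n_per_class)]
    rows += [(rng.gauss(5, 0.5), rng.gauss(5, 0.5), "B") for _ in range(n_per_class)]
    return rows

def train_test_split(rows, test_frac=0.25, seed=0):
    rng = random.Random(seed)
    rows = rows[:]
    rng.shuffle(rows)
    cut = int(len(rows) * (1 - test_frac))
    return rows[:cut], rows[cut:]

def fit_centroids(rows):
    # "Model" = per-class mean feature vector.
    sums, counts = {}, {}
    for x, y, label in rows:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {lbl: (sx / counts[lbl], sy / counts[lbl]) for lbl, (sx, sy) in sums.items()}

def predict(model, x, y):
    # Assign the class whose centroid is closest (squared Euclidean distance).
    return min(model, key=lambda lbl: (model[lbl][0] - x) ** 2 + (model[lbl][1] - y) ** 2)

train, test = train_test_split(make_data())
model = fit_centroids(train)
accuracy = sum(predict(model, x, y) == lbl for x, y, lbl in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

Evaluating on the held-out test split (never on training data) is the point of the split step; the online-testing stage the posting mentions would then A/B the model against live traffic.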
                      Jun-01-20
                      ($) : USD 100 / Hourly / C2C
Role: Data Analyst. Location: Oakland, CA. Duration: 6-12 Months.
QUALIFICATIONS: Strong SQL, data mapping, and data profiling skills. Knowledge of data transformation, data management & ETL processes. Deep understanding of BI methodologies, data architecture principles, master data, and metadata. Hands-on experience working with database technologies like MSSQL and MySQL and leading BI platforms such as Pentaho, QlikView, and Microsoft BI tools. Experience developing visualizations and working with BI tools a plus.
EDUCATION AND EXPERIENCE: Minimum Bachelor's degree in math, computer science, or a related field. 5+ years of ETL experience with SSIS required. 5+ years of structured data analysis and data requirements development experience required. Experience documenting data architecture, data flows, and business process flows required. Prior experience with database and model design required. Experience/knowledge of health care, health insurance, professional liability insurance, or the medical industry is a plus. If your skills match our requirements, Click here to Apply. Be sure to reference the job number and title in the subject line.
                      Jun-01-20
                      ($) : USD 75 / Hourly / C2C
Job Role: Data Analyst/Developer. Location: Redmond, WA.
Responsibilities: manage technology in projects and provide technical guidance/solutions for work completion; develop and guide team members in enhancing their technical capabilities and increasing productivity; ensure process compliance in the assigned module and participate in technical discussions/reviews; prepare and submit status reports to minimize exposure and risks on the project or close escalations.
Required Skills:
- 7-8 years of relevant work experience as a Data Analyst/Developer driving business insights
- Strong experience in business requirements analysis and associated metrics and data requests
- Experience in design, development, and testing using SQL and T-SQL
- Experience in data profiling and quick understanding of new data sources
- Experience in data engineering activities to pull data from multiple source systems and integrate various data sources
- Good experience creating workflows and pipelines for data processing
- Strong SQL query experience with large data sets from multiple sources
- Experience creating data streams in Cosmos and data sets in SQL Server
- Strong experience creating Scope scripts to push & pull Cosmos data; SQL-izing the Cosmos stream data through XML
- Good experience creating visualizations using Power BI
- Strong debugging capabilities for faster bug fixing
- Knowledge of VSO/Git would be preferable
- Provide correct and accurate responses to queries
- Nice to have: knowledge of C#, PowerShell, and .NET
If your skills match our requirements, Click here to Apply. Be sure to reference the job number and title in the subject line.
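The posting above mentions turning XML-shaped stream data into SQL-able rows. As a minimal illustration with Python's built-in XML parser (the document fragment and field names are invented; Microsoft's Cosmos actually uses Scope scripts rather than this):

```python
import xml.etree.ElementTree as ET

# Hypothetical XML export fragment.
doc = """
<rows>
  <row id="1"><region>West</region><sales>100</sales></row>
  <row id="2"><region>East</region><sales>250</sales></row>
</rows>
""".strip()

def xml_to_records(xml_text):
    """Flatten an XML document of <row> elements into dicts ready for a SQL INSERT."""
    root = ET.fromstring(xml_text)
    return [
        {"id": int(row.get("id")),
         "region": row.findtext("region"),
         "sales": int(row.findtext("sales"))}
        for row in root.findall("row")
    ]

print(xml_to_records(doc))
```

Once flattened to uniform records, the data can be bulk-inserted into SQL Server and joined like any other table, which is the point of the "SQL-izing" step.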
                      Jun-01-20
                      ($) : Market
Datastage Developer. Type: Contract. Location: Plano, TX. No. of positions: 3. Required skills: DataStage, SQL, Unix, Snowflake. Banking domain a plus. If your skills match our requirements, Click here to Apply. Be sure to reference the job number and title in the subject line.
                      Jun-01-20
                      ($) : USD 62 / Hourly / C2C - Own Corp
VDart is an IT staffing firm based out of Atlanta, GA specializing in Digital & Emerging technologies. Founded in 2007, VDart has over 1700 employees and contractors spread across 3 continents, with delivery centers in the UK, Mexico, Canada, and India, providing talent solutions to global customers covering EMEA, APAC & the Americas, and deep technology and domain expertise in the BFSI, Energy & Utility, Technology, and CPG & Retail industry verticals. Please send your resume for immediate consideration.
Job Title: Sr. DataStage Developer. Location: San Antonio, TX. Hiring Mode: Contract (Long Term).
Required Skills:
- 5 to 8 years of experience in ETL design and development using IBM DataStage components, with at least 5 years in DataStage 9.1 or later
- Extensive knowledge of Unix shell scripting
- Thorough understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts)
- Research, development, documentation, and modification of ETL processes per data architecture and modeling requirements
- Ensure appropriate documentation for all new development and modifications of ETL processes and jobs
- At least 1 year in either the insurance or banking domain
- Extensive working experience with different data sources such as Netezza and Oracle databases
- Very good at writing complex SQL queries
Desired Skills:
- Experience translating functional or non-functional requirements to system requirements
- Exposure to scheduling tools like BMC Control-M
- Experience in agile methodologies
- Exposure to Big Data technologies would be an added advantage
- End-to-end understanding of source -> ETL -> application layer -> reporting
- Good oral, written & presentation skills to interact with business & technical teams on a day-to-day basis, as it's a client-facing role
- Self-driven and able to run the show with minimal or no assistance
- Minimum 2 years of US experience
If your skills match our requirements, Click here to Apply. Be sure to reference the job number and title in the subject line.
                      Jun-01-20
J.O. #63408, Informatica Administrator MDM, Linthicum, MD. Greetings, Business Partners: we have multiple openings for an Informatica Administrator MDM with our clients. Please submit your consultants. Please do not share profiles of candidates who are uncomfortable sharing legal name, DOB, copies of ID proof, work authorization, and passport details (a must for H1B, EAD, etc.).
Job Title: Informatica Administrator MDM. Location: Linthicum, MD. Client: GCOM Software. Duration: 12+ months. Required Skills: Informatica, Linux.
Job Description: This candidate will support a large initiative for the State of Maryland, supporting the integration of several Health & Human Services agencies. This will be a long-term contract role, 12-24 months in duration.
Key Responsibilities:
- Monitor and coordinate all data system operations, including security procedures, and liaise with the infrastructure, security, DevOps, Data Platform, and application teams
- Ensure that necessary system backups are performed and that storage and rotation of backups is accomplished
- Monitor and maintain records of system performance and capacity to arrange vendor services or other actions for reconfiguration, and anticipate requirements for system expansion
- Assist managers in monitoring and complying with State data security requirements
- Coordinate software development, user training, network management, and major/minor software installation, upgrade, and patch management
- Must demonstrate a broad understanding of client IT environmental issues and solutions and be a recognized expert within the IT industry
- Must demonstrate advanced abilities to team and mentor, and possess demonstrated excellence in written and verbal communication skills
Qualifications:
- 4-year Bachelor's degree from an accredited college or university with a major in Computer Science, Information Systems, Engineering, Business, or another related scientific or technical discipline; a Master's degree is preferred
- Product administration experience in a RHEL Linux based environment
- Experience administering a cloud-based multi-user environment, with expertise in planning, designing, building, and implementing IT systems
- Experience installing, configuring, and managing Informatica MDM Hub and the Service Integration Framework on the Informatica MDM platform
- Thorough understanding of Informatica MDM concepts including, but not limited to, match rules tuning, workflow, IDD, etc.
- Install, administer, upgrade, and manage the Informatica MDM platform (master data management, previously known as Siperian), including application servers for cluster and high-availability needs
- Participate in administrative activities related to the MDM platform, including but not limited to MDM Hub, Process Server, ActiveVOS, Provisioning, and IDD
- Extensive experience installing and supporting on Linux and on cloud platforms
- Experience configuring and supporting high-availability multi-node environments
- Extensive experience upgrading and patching to keep systems up to date
- Experience working with database administrators to troubleshoot problems
- Experience in code deployment across environments
- Experience setting up the monitoring process for Informatica tools
- Experience in system recovery in the event of system failures
- Experience applying hotfixes to Informatica servers
Vinith Ailam, Technical Recruiter, WAFTS SOLUTIONS INC., 32969 Hamilton Court, Suite 123, Farmington Hills, MI 48334. Website: www.waftssolutions.com
                      May-31-20
                      ($) : USD 64 / Hourly / C2C
ETL Lead. Charlotte, NC. Contract.
Job Description: Hands-on Informatica PowerCenter experience; hands-on PL/SQL experience; Oracle database experience; Unix experience; experience leading ETL projects; good SQL experience; experience with a scheduling tool (like Autosys); good knowledge of DW concepts.
                      May-31-20
                      ($) : Market
Note: This position can sit remote, but this worker would need to work EST/CST.
Description: This role is unique; it is needed within the GRS Data Governance IT team, which is currently standing up the Informatica Data Governance tools in AWS and working with Hosting to have the three environments stood up, partnering with Hosting teams internally. Once everything is stood up, the role will move toward driving adoption. Must have a passion for data governance. The old tools are currently stood up on-prem; once up in the cloud, the on-prem tools will be decommissioned. After that, Hosting will take over administration, and this team takes over new enhancements based on business need, understanding how the tools can be used to support business needs. A candidate with governance experience is the #1 priority; the tools are secondary. There is a small technical aspect to this project that will be over quickly; it is more important to have someone who can drive implementation and adoption, drive with the business, communicate with leadership, communicate effectively, and "sell the product and solution." We have an employee moving on from the data governance space and need to back-fill for him. The ideal candidate is at a principal level and has Informatica Data Governance tool experience.
We also have the following list of skills required for this role:
Technical Skills: AWS CloudFormation Template (CFT); S3 (storage); EC2 (compute); Lambda functions; paired/mob programming; programming languages: Java, Bash (Unix), Python, JSON, and YAML; code automation pipeline configuration (Bamboo); source code control (Git using Bitbucket), integration with Jira a plus; PuTTY; DbVisualizer; Oracle SQL.
Tool Experience: data governance, data catalog, data quality, glossary maintenance. Experience with Informatica tools preferred, but Alation or Collibra experience would also be helpful.
                      May-30-20
                      ($) : USD 90 / Hourly / C2C
                      Title of Position: Informatica Administrator Location: Pleasanton CA 94588 Duration: Long Term Employment: Contractual Rate: $Negotiable Skills: Strong Administration skill in Informatica Powercenter with HA GRID environment Additional Informatica Skills: IDQ, MM, BG, DVO, PMPC and BDM products and concentrating on minimum power center products/platform we have running GRID. Here is a position we have posted for a senior Informatica FTE in my area, the technical skillsets would be the same, but responsibilities may be lower. Conducts or oversees business-specific projects by applying deep expertise in subject area; promoting adherence to all procedures and policies; developing work plans to meet business priorities and deadlines; determining and carrying out processes and methodologies; coordinating and delegating resources to accomplish organizational goals; partnering internally and externally to make effective business decisions; solving complex problems; escalating issues or risks, as appropriate; monitoring progress and results; recognizing and capitalizing on improvement opportunities; evaluating recommendations made; and influencing the completion of project tasks by others. Practices self-leadership and promotes learning in others by building relationships with cross-functional stakeholders; communicating information and providing advice to drive projects forward; influencing team members within assigned unit; listening and responding to, seeking, and addressing performance feedback; adapting to competing demands and new responsibilities; providing feedback to others, including upward feedback to leadership and mentoring junior team members; creating and executing plans to capitalize on strengths and improve opportunity areas; and adapting to and learning from change, difficulties, and feedback. 
Develops requirements, or leads a team of IT consultants in the development of requirements, for complex or specialized process or system solutions which may span multiple business domains by partnering with stakeholders and appropriate IT teams (for example, Solutions Delivery, Infrastructure, Enterprise Architecture). Leverages multiple business requirements gathering methodologies to identify business, functional, and non-functional requirements (for example, SMART) across the enterprise. Leads and oversees the development and documentation of comprehensive business cases to assess the costs, benefits, ROI, and Total Cost of Ownership (TCO) of complex solution proposals. Provides insight, guidance, and recommendations throughout the evolution of applications, systems, and/or processes to a desired future state by maintaining and leveraging a comprehensive understanding of how current processes impact business operations across the enterprise. Maps current state against future state processes. Defines the impact of requirements on upstream and downstream solution components. Provides insight and influence to senior management and business leaders on how to integrate requirements with current systems and business processes across the enterprise. Reviews, evaluates, and prioritizes value gaps and opportunities for process enhancements or efficiencies. Influences solution design by providing insight and consultation at design sessions with IT teams to help translate requirements into workable business solutions. Recommends and advocates for additional data and/or services needed to address key business issues related to process or solutions design. Participates in evaluating third-party vendors as directed. Drives continuous process improvement by leading the development, implementation, and maintenance of standardized tools, templates, and processes across the enterprise.
Recommends and advocates for regional and national process improvements which align with sustainable best practices and the strategic and tactical goals of the business.
                      May-30-20
                      ($) : Market
                      VDart: We are a Global Information Technology Services & Workforce Solutions firm headquartered in Atlanta, GA, with a presence in the US, Canada, Mexico, the UK, Belgium, Japan, and India. Founded in 2007, our team of 2550+ professionals continually creates impact for our customers worldwide, solving complex technology challenges with cutting-edge technologies. We specialize in providing Fortune 1000 companies with niche, hard-to-find skills in technologies including Social, Mobile, Big Data Analytics, Data Sciences, Cyber Security, IoT, Cloud, Machine Learning, and Artificial Intelligence. With delivery centers in the UK, Mexico, Canada, and India, we provide global workforce solutions to our customers covering EMEA, APAC, and the Americas. VDart is an award-winning organization recognized by the Inc 5000 Hall of Fame; Atlanta Business Chronicle's Fastest Growing Companies; NMSDC's National Supplier of the Year; Ernst & Young's Regional Entrepreneur of the Year; and more. Sr. Data Scientist (AI/ML) Location: Plano, TX Duration: Long Term Experience & Qualification: 7-10 years of experience in the industry and at least 5 years of experience in production environments. You can demonstrate a track record in the following areas: Machine learning, especially in the context of commercial products. Python programming, with extensive knowledge of data processing and ML libraries such as Pandas, Scikit-Learn, TensorFlow, Keras, etc. Distributed data processing using Apache Spark, Apache Hadoop, etc. Experience with popular cloud provider environments (e.g., AWS, Azure, Google Cloud) is preferred. Software development methodologies and tools (unit and system testing, code reviews, Git). You will research, build, and deploy scalable models based on machine learning, artificial intelligence, and statistics. Starting from real and impactful use cases, you will explore large datasets to derive descriptive and inferential statistical properties of the data.
You love playing with data and finding patterns that tell excellent data-stories. You will build insightful and scalable machine learning tools that help analyze model features, perform feature selection, and interpret model results. Using a large collection of customer datasets, you will evaluate the impact of your models on product performance and acceptance. You thrive in dynamic environments that require a rare blend of innovation and speed of execution. You pride yourself on your communication and interpersonal skills, a keen eye for extracting signal from the data, and telling excellent data-stories. You have the ability to autonomously plan and organize your work assignments based on the objectives of the team. Collaborating with a global team of cloud, network, and software engineers, you will guide the integration of the algorithms into the production codebase. Key Skills: Data Science, machine learning, artificial intelligence, Pandas, Scikit-Learn, TensorFlow, Keras, Apache Spark, Apache Hadoop. Referral Program: Ask our recruiting team about how you can be a part of our referral program. If you refer a candidate with the desired qualifications and your candidate accepts the role, you can earn a generous referral fee. We want to hire the best talent available and are committed to building great teams and partnerships. We are an Equal Employment Opportunity Employer. VDart Inc, Alpharetta, GA. Follow us on Twitter for the hottest positions: @VDart_Jobs. Follow us on Twitter: @vdartinc
                      May-30-20
