Hi, greetings from MSRCOSMOS. Here is the job description for the opening below.

Job Title: AWS Architect with AWS CDK and Glue
Location: Atlanta, GA (Hybrid - 2 days onsite)
Type: Contract

Job Description:
· Strong foundational knowledge of software engineering, application programming, and solution design.
· Knowledge of IAM users, roles, policies, and permissions boundaries.
· Knowledge of identity federation in AWS, both SAML and OIDC: IAM Identity Center, Amazon Cognito, STS.
· Solution architecture on AWS for static and dynamic content.
· Knowledge of the various data storage solutions offered by AWS: object storage, block storage, file storage, and Storage Gateway, with special focus on S3 due to extensive use.
· Knowledge of Amazon RDS: multi-AZ clusters and instances, read replicas, RDS Proxy.
· Knowledge of Amazon DynamoDB: indexes, DynamoDB Streams, DynamoDB Accelerator, and DynamoDB capacity modes.
· Understanding of data analysis, verification of data quality, and ensuring data consistency using AWS services.
· Ability to carry out comparative studies of AWS services to address specific use cases, e.g., AWS Glue vs. Amazon AppFlow for data ingestion from Salesforce.
· Knowledge and skill to design solutions around homogeneous and heterogeneous data ingestion patterns, with special focus on DMS, AWS Transfer Family, and DataSync.
· Implementing data transformation services based on requirements, e.g., AWS Glue, Lambda, Amazon Redshift.
· Orchestration services to build workflows for data ETL pipelines, e.g., Lambda, EventBridge.
· Strong knowledge of AWS services used in monitoring, audit, and governance.
· AWS CDK knowledge.
Vignesh R’s Post
Hello Connections, there is an urgent requirement for an AWS Solution Architect with one of our clients.

Role: AWS Solution Architect
Location: Indianapolis, IN 46204 (Hybrid)
Note: Locals only
Duration: 6+ months

Job Description / Key Responsibilities:
· Solution Design: Collaborate with stakeholders to understand business and technical requirements, and design scalable, cost-effective solutions using AWS services.
· Architectural Planning: Create detailed architectural diagrams and documentation to outline the structure of AWS solutions, including network architecture, data storage, and computing resources.
· Service Selection: Recommend and select appropriate AWS services to meet the specific needs of each project, considering factors like performance, cost, and security.
· Infrastructure Design and Deployment: Plan, implement, and maintain AWS infrastructure, including but not limited to virtual private clouds (VPCs), subnets, security groups, IAM roles, and networking components.
· Application Deployment: Facilitate the deployment of applications on AWS, using services like Amazon EC2, Lambda, Elastic Beanstalk, and containers (ECS/EKS).
· Infrastructure as Code (IaC): Implement infrastructure automation and manage AWS resources using tools like AWS CloudFormation or other industry-proven tools.
· Security: Ensure the security and compliance of AWS solutions by implementing best practices, encryption, and identity and access management (IAM) controls.
· Cost Optimization: Optimize AWS costs by selecting cost-effective services and architectures, implementing cost monitoring, and providing recommendations for cost reduction.
· Performance and Scalability: Design solutions that can scale horizontally and vertically to accommodate growing workloads and demand.
· High Availability and Disaster Recovery: Design solutions with high availability and disaster recovery in mind, using features like AWS Availability Zones and backup strategies.
· Monitoring and Optimization: Set up monitoring and alerting systems to proactively identify and address performance and availability issues.
· Technical Leadership: Collaborate with development and operations teams to guide the implementation of AWS solutions and provide technical leadership.
· Customer Engagement: Act as a trusted advisor to customers, provide technical expertise, and assist in decision-making related to AWS services.
· AWS Glue: Oversee and help the AWS Glue developer in designing and implementing data integration, transformation, and ETL (Extract, Transform, Load) workflows using AWS Glue along with other AWS services.
· Vendor Management: Work with the IDOH vendor, who will be responsible for completing the migration of on-premises applications and databases to the AWS environment.

If interested, please share resumes with charan@anveta.com
Need senior candidates only. 100% remote. Only for W2 & 1099.

Senior Cloud Architect

Responsibilities:
· Evangelize and innovate to ensure the platform uses the most reasonable and effective cutting-edge technologies.
· Evangelize microservice-based architecture using containerized applications.
· Experience strategizing on-prem to cloud transformation for large-scale applications.
· Design and implement solutions that span GCP/GKE, CI/CD, monitoring, and security.
· Collaborate and pair with the Platform team to ensure knowledge and expertise are shared and developed.
· Stay current on industry trends; innovate through research, proofs of concept, and demos.
· Shared responsibility for 24x7 research and resolution of production system problems through participation in an on-call rotation.
· Support a "security first" advocacy and encourage platform solutions that enable the microservice product teams to "shift left."
· Ensure compliance with applicable security and privacy standards and regulations.
· Participate in the recruitment of team members, both employees and vendor employees.
· Train and mentor peers and management.

Technical Skills/Experience:
· Cloud: Google (preferred), AWS, Azure
· GCP tools: Cloud Data Fusion, Vertex AI, Dataflow, Pub/Sub
· Orchestration: GKE (preferred), AKS, EKS
· CI/CD: Jenkins, Harness
· Monitoring: Prometheus/Grafana (preferred), Stackdriver
· Database: Mongo, Cloud SQL
· Service Mesh: Istio
· API: Cloud Endpoints, Apigee
· Language/scripting: Python (preferred), Bash, Node.js
· Infrastructure as Code: Terraform
· EDW: BigQuery
· Data analysis: Splunk
· Security: PCI compliance, Prisma
· Collaboration/Issue tracking: Jira, Confluence
· Testing: Automation, System, Performance
· Methodology: Agile (Scrum, Kanban)

If interested, drop resumes to: sujan.a@vuesol.com

#gcpcloud #cloudarchitecture #awsdevops #database #cicd #w2 #scrum #agile #tools #methodologies #jenkins #hiring #infrastructure #testing #collaboration #dataanalysis #edw #languagemodels #azurecloudengineer #google
I am sending you the requirement for an Azure Architect position. The client prefers the candidate to come to the office in a hybrid role. The candidate should have strong working knowledge of Azure Data Factory.

As Azure Architect, you will be responsible for leading the design and implementation of cloud solutions on the Microsoft Azure platform. You will collaborate with various teams to understand business requirements and develop robust architectures that align with best practices and security standards.

Responsibilities:
· Design and architect scalable, reliable, and cost-effective cloud solutions on Microsoft Azure to meet the organization's business needs and technical requirements.
· Plan, implement, and maintain Azure infrastructure, including virtual networks (VNets), subnets, network security groups (NSGs), and Azure Active Directory (Azure AD) integration.
· Facilitate the deployment of applications and services on Azure, using services like Azure App Services, Azure Functions, and Azure Kubernetes Service (AKS).
· Strong experience in Azure Data Factory.
· Design and implement data storage and management solutions using Azure SQL Database, Azure Cosmos DB, Azure Blob Storage, and other relevant Azure data services.
· Ensure that Azure environments and applications are secure and compliant with industry standards. Implement identity and access management, encryption, and network security measures.
· Monitor Azure resources' performance and identify opportunities for optimization, scaling, and performance improvements.
· Design and implement solutions for high availability, fault tolerance, and disaster recovery using Azure Availability Zones, Azure Site Recovery, and other relevant services.
· Optimize Azure resource usage and provide cost-saving recommendations to the organization.
· Create detailed technical documentation, including architecture diagrams, operational procedures, and guidelines for Azure services usage.
· Leverage Azure Data Factory's capabilities to build efficient data pipelines for data ingestion, transformation, and loading.
· Integrate Azure Data Factory pipelines with Azure SQL scripts and REST APIs to automate data workflows and enhance data processing capabilities.
· Collaborate with cross-functional teams, including developers, operations, and project managers, to ensure successful project delivery.

venkat@mercurysoftsol.com
Visa: GC/USC/H1 transfer

#azurearchitect #dataengineer #hybrid #datafactory
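As a rough illustration of the Azure Data Factory pipelines the posting describes, a copy activity can be declared as JSON. This is a simplified, hypothetical sketch (the pipeline, activity, and dataset names are invented, and the real ADF schema has many more options):

```json
{
  "name": "CopySalesData",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceBlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkSqlDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

In practice such a definition is deployed through the ADF authoring UI or an ARM template, and the referenced datasets carry the actual connection details.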
🔒 Navigating AWS IAM for Data Engineers 🔒

Data engineers play a pivotal role in shaping how organizations leverage their data, but with great power comes great responsibility. Enter AWS IAM (Identity and Access Management), your key ally for securing and managing data in the AWS cloud. Here's what you need to know:

🔑 Users: In the world of IAM, users represent individuals or applications that interact with AWS resources. As a data engineer, you have your own unique set of credentials for accessing data-related services and tools.

🔑 Roles: IAM roles are your best friend when it comes to granting permissions and access to AWS resources. Imagine needing to pull data from an S3 bucket or run EMR clusters – roles are your bridge to these capabilities, ensuring seamless, controlled access.

🔑 Groups: Groups are collections of users, with permissions assigned to the group as a whole. Groups provide a convenient way to manage permissions for users with similar needs by categorizing them according to job role, department, or any other requirement.

🔑 Policies: Policies define the rules of the game. As a data engineer, you'll work with policies to specify who can access your data pipelines, databases, or any other data resources. AWS offers managed policies for common use cases, or you can craft custom policies for fine-tuned control.

AWS IAM is your security blanket for ensuring that data is accessed and processed securely, adhering to compliance requirements. For data engineers, it's your key to orchestrating, transforming, and delivering data in a controlled and efficient manner. Stay data-driven, and stay secure! 🚀📊

#DataEngineering #AWS #IAM #DataSecurity
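To make the "custom policies for fine-tuned control" idea concrete, here is a minimal sketch of a least-privilege IAM policy document granting read-only access to a single S3 bucket. The bucket name `example-data-lake` is hypothetical, and the policy is built as a Python dict purely for illustration:

```python
import json

# A hypothetical least-privilege policy: read-only access to one S3 bucket.
# ListBucket applies to the bucket ARN; GetObject applies to objects ("/*").
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadDataLakeObjects",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-data-lake",
                "arn:aws:s3:::example-data-lake/*",
            ],
        }
    ],
}

# Serialize to the JSON form you would paste into the IAM console
# or attach to a role via infrastructure-as-code.
print(json.dumps(policy, indent=2))
```

Attaching a policy like this to a role (rather than to individual users) follows the role-based pattern described above: the data pipeline assumes the role and inherits exactly these permissions, nothing more.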
Hi connection, we have openings with one of our clients.

Role: AWS Architect
Experience: 6+ years

Detailed JD:
• Design, implement, and maintain all AWS infrastructure and services within a managed service environment.
• Work closely with application, network, and security teams to ensure requirements are reflected appropriately in the AWS design.
• Design, deploy, and maintain enterprise-class security, network, and systems management applications within an AWS environment.
• Implement process and quality improvements through task automation. Institute infrastructure as code, security automation, and automation of routine maintenance tasks.
• Provide technical solutions and consulting to the application teams for the migration approach.
• Work with engineering teams to introduce new AWS services.
• Support the business development lifecycle (business development, capture, solution architecture for the next migration path, cost reporting and improvements).
• Provide integration support and advanced troubleshooting for development and production environments.
• Work closely with a team of architects, engineers, and developers to create functional design specifications and AWS reference architectures, and assist with other project deliverables as needed.
• Review findings, identify root causes for common issues, and provide recommendations for sustainable improvement.

Required Skills:
• Terraform experience is a must.
• Focus on high availability, fault tolerance, and auto-scaling using Terraform.
• Configured and managed various AWS services including EC2, RDS, VPC, S3, Glacier, CloudWatch, CloudFront, Route 53, etc.
• Configured various performance metrics using AWS CloudWatch & CloudTrail.
• Worked on configuring cross-account deployments using AWS CodePipeline, CodeBuild, and CodeDeploy by creating cross-account policies & roles in IAM.
• Written various Lambda services for automating functionality in the cloud.
• Used AWS Route 53 to configure high availability and disaster recovery, keeping the environment up and running in case of any unexpected disaster.
• Maintained user accounts (IAM) and the RDS, Route 53, VPC, DynamoDB, SES, SQS, and SNS services in the AWS cloud.
• Experience with microservice architecture.
• Above-average skills in Python or another high-level programming language.
• Working knowledge of containers and container orchestration platforms.
• Deep knowledge of configuration management, network connectivity, and deployment automation tools on AWS.
• Deep understanding of various security aspects.
• Experience and exposure to design reviews for the introduction of new technologies.

Note: Terraform experience is a must.

Interested candidates can share your CV with kathineni.sravani@apekshinfotech.com

#aws #awsarchitect #terraform #ec2 #rds #vpc #dynamodb #ses #awscloud #python
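The "Lambda services for automating functionality" requirement above can be sketched as a small handler. This is a generic illustration, not from the posting: the event shape and instance IDs are hypothetical, and the actual boto3 call (`ec2.stop_instances(**request)`) is left out so the sketch stays self-contained:

```python
import json

def handler(event, context=None):
    """Lambda-style automation sketch: build a stop request for the EC2
    instance IDs named in the event. A real handler would pass `request`
    to boto3's ec2.stop_instances(); here we just return it."""
    instance_ids = event.get("instance_ids", [])
    if not instance_ids:
        # Mirror Lambda's common proxy-response shape for an error case.
        return {"statusCode": 400, "body": "no instances supplied"}
    request = {"InstanceIds": instance_ids}
    return {"statusCode": 200, "body": json.dumps(request)}

# Local invocation with a fake event, the way such handlers are unit-tested.
print(handler({"instance_ids": ["i-0123456789abcdef0"]}))
```

In production this handler would typically be wired to an EventBridge schedule or rule, which is the orchestration pattern the other postings in this thread mention.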
ALTA IT Services is #hiring an Architect III for #hybrid work in Reston, VA. Qualifications include:
🔍 Expertise in relational and NoSQL databases.
🛠️ AWS Database Migration Service & test data management.
☁️ Cloud migration & microservices architecture.
💻 Skilled in AWS, development, and networking.
🛠️ Experience with JIRA and Confluence.

Learn more and apply today: https://ow.ly/Z8bx50QuW4P

#ALTAIT #ITJobs #RestonVA #DatabaseExpert #AWSMigration #CloudArchitecture #Microservices #AWSskills #JIRAExperience #ConfluenceExpert
Hello Associate, hope you are doing well. We have the below requirement open. Please send genuine candidates to my email ID karthik@ritwikinfotech.com

Position: IDOH Azure Architect
Location: Indianapolis, IN 46204 (webcam interview only)
Duration: Long term
Local candidates: Hybrid, 3 days onsite
Out-of-state candidates: 100% remote

JOB DESCRIPTION:
The resource will be responsible for leading the design and implementation of cloud solutions on the Microsoft Azure platform. You will collaborate with various teams to understand business requirements and develop robust architectures that align with best practices and security standards.

Responsibilities:
• Cloud Solution Design: Design and architect scalable, reliable, and cost-effective cloud solutions on Microsoft Azure to meet the organization's business needs and technical requirements.
• Infrastructure Design and Deployment: Plan, implement, and maintain Azure infrastructure, including virtual networks (VNets), subnets, network security groups (NSGs), and Azure Active Directory (Azure AD) integration.
• Application Deployment: Facilitate the deployment of applications and services on Azure, using services like Azure App Services, Azure Functions, and Azure Kubernetes Service (AKS).
• Data Management: Design and implement data storage and management solutions using Azure SQL Database, Azure Cosmos DB, Azure Blob Storage, and other relevant Azure data services.
• Security and Compliance: Ensure that Azure environments and applications are secure and compliant with industry standards. Implement identity and access management, encryption, and network security measures.
• Performance Optimization: Monitor Azure resources' performance and identify opportunities for optimization, scaling, and performance improvements.
• High Availability and Disaster Recovery: Design and implement solutions for high availability, fault tolerance, and disaster recovery using Azure Availability Zones, Azure Site Recovery, and other relevant services.
• Cost Optimization: Optimize Azure resource usage and provide cost-saving recommendations to the organization.
• Documentation: Create detailed technical documentation, including architecture diagrams, operational procedures, and guidelines for Azure services usage.
• Team Collaboration: Collaborate with cross-functional teams, including developers, operations, and project managers, to ensure successful project delivery.
#hiring *AWS Solution Architect (Onsite)*, Plano, *United States*, full-time
#jobs #jobseekers #careers #Planojobs #Texasjobs #ITCommunications
*Apply*: https://lnkd.in/gJv85c3N

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future: a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies. By applying AI and data science, we help leading companies to prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant's AIA practice takes insights that are buried in data and provides businesses a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turns data resources into informative, meaningful intelligence.

Responsibilities:
· Technology architecture & implementation experience, with deep implementation experience with data solutions.
· Years of experience in Data Engineering, with 5+ years of cloud data engineering experience.
· Technology pre-sales experience: architecture, effort sizing, estimation, and solution defense.
· Data architecture patterns: data warehouse, data lake, data mesh, lakehouse, data as a product.
· Develop or co-develop proofs of concept and prototypes with customer teams.
· Excellent understanding of distributed computing fundamentals.
· Experience working with one or more major cloud vendors.
· Deep expertise in end-to-end pipeline (or ETL) development following best practices, including orchestration and optimization of data pipelines.
· Strong understanding of the full CI/CD lifecycle.
· Large legacy migration experience (Hadoop, Teradata, and the like) to cloud data platforms.
· Expert-level proficiency in engineering & optimizing with various data engineering ingestion patterns: batch, micro-batch, streaming, and API.
· Understand the imperatives of change data capture, with a tools & best-practices POV.
· Architect and solution data governance capability pillars supporting a modern data ecosystem.
· Data services and various consumption archetypes, including semantic layers, BI tools, and AI & ML.
· Thought leadership designing self-service data engineering platforms & solutions.
· Ability to engage and offer differing points of view on customers' architecture using the AWS and Snowflake platforms.
· Strong understanding of the Snowflake platform, including evolving services like Snowpark.
· Implementation expertise using AWS services: EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation, and Snowflake.
· Security design and implementation on AWS & Snowflake.
· Pipeline development in multi-hop pipeline a
Bench sales Recruiter at intuites
navya@intuites.com