
Principal Data Engineer

Verizon
remote work
United States, Massachusetts, Boston
Nov 18, 2024

When you join Verizon

You want more out of a career. A place to share your ideas freely - even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love - driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together - lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the V Team Life.

What you'll be doing...

This role is responsible for building end-to-end data products by understanding, ingesting, and curating data. Depending on the need, the product may be built on GCP or other platforms, with an understanding of Teradata and the Hadoop ecosystem. This role requires hands-on experience with SQL, Python, and Google Cloud native tools such as BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Spanner. You will be...

  • Developing high-quality code that meets standards and delivers desired functionality using cutting-edge technology.

  • Programming components, developing features, and frameworks.

  • Working independently and contributing to immediate and cross-functional teams.

  • Participating in design discussions and contributing to architectural decisions.

  • Analyzing problem statements, breaking down problems, and providing solutions.

  • Taking ownership of large tasks, delivering on time, and mentoring team members.

  • Exploring alternate technologies and approaches to solve problems and following an agile-based approach to deliver data products.

What we're looking for...

You'll need to have:
  • Six or more years of relevant experience required, demonstrated through one or a combination of work and/or military experience, or specialized training.

  • Experience in data warehousing, Data Lakes, and big data platforms.

  • Experience with GCP tools such as Cloud Dataflow, Cloud Shell SDK, Cloud Composer, Google Cloud Storage (GCS), and BigQuery.

Even better if you have one or more of the following:
  • Bachelor's or Master's degree in Computer Science, Information Science, Engineering, or a related field with 4+ years of relevant work experience.

  • 4 or more years of proven experience collaborating with data engineers, architects, data scientists, and enterprise platform teams in designing and deploying data products and ML models in production.

  • 4 or more years of experience with GCP tools such as Cloud Dataflow, Dataproc, Cloud Shell SDK, Cloud Composer, Google Cloud Storage (GCS), Cloud Functions, and BigQuery.

  • Experience in designing and deploying Hadoop clusters and various big data analytical tools, including HDFS, PIG, Hive, Sqoop, Spark, and Oozie.

  • Hands-on experience in designing and building data pipelines using Airflow or Apache Beam in GCP (Dataproc and BigQuery) for ETL jobs.

  • Hands-on experience implementing real-time solutions using Apache Beam SDK with Dataflow, Spark, or Flink as runners.

  • Experience with real-time data streaming technologies such as Google Pub/Sub and Apache Kafka.

  • Experience with advanced transformations and windowing functions in real-time data processing.

  • Experience with Google Dataflow templates for creating reusable and scalable data processing pipelines.

  • Experience working with at least one NoSQL database (HBase, Cassandra, Couchbase) and one relational database (Oracle, MySQL, Teradata).

  • Strong problem-solving skills and the ability to empathize and maintain a positive attitude.

  • Participation in technology and industry forums on evolving data engineering practices.

  • Strategic thinker with the ability to apply a business mindset to data issues and initiatives, crafting cross-business strategies and plans for multiple stakeholders.

  • Strong leadership, communication, persuasion, and teamwork skills.

  • GCP/AWS Cloud certifications.

  • Familiarity with Large Language Models and a passion for Generative AI tools.
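To illustrate the kind of real-time windowed aggregation referenced above (Apache Beam fixed windows with Dataflow, Spark, or Flink as runners), here is a minimal sketch in plain Python of the tumbling-window counting that Beam's FixedWindows plus a per-key combine would perform. The event data, window size, and function names are hypothetical, chosen only for illustration; a production pipeline would express this with the Beam SDK rather than by hand.

```python
# Sketch of tumbling-window, per-key counting -- the logic that
# Beam's FixedWindows + CombinePerKey implements in a streaming job.
# All names and the sample events below are hypothetical.
from collections import defaultdict

WINDOW_SECONDS = 60  # assumed window size for this example


def window_start(ts, size=WINDOW_SECONDS):
    """Align an event timestamp (seconds) to the start of its window."""
    return ts - (ts % size)


def count_per_window(events):
    """events: iterable of (timestamp_seconds, key).

    Returns {(window_start, key): count}, the same shape a windowed
    count-per-key pipeline would emit, one result per window pane.
    """
    counts = defaultdict(int)
    for ts, key in events:
        counts[(window_start(ts), key)] += 1
    return dict(counts)


# Two "a" events land in window [0, 60); "b" and the late "a" in [60, 120).
events = [(5, "a"), (30, "a"), (70, "b"), (110, "a")]
print(count_per_window(events))
```

In a real Beam pipeline the same grouping would be written declaratively (window into fixed intervals, then combine per key), and the runner, not application code, handles out-of-order data via watermarks and triggers.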

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above.

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and a minimum eight assigned office days per month that will be set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity

We're proud to be an equal opportunity employer - and celebrate our employees' differences, including race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, and Veteran status. At Verizon, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected, and empowered to reach their potential and contribute their best. Check out our diversity and inclusion page to learn more.
