Big Data Developer
In a nutshell
Sainsbury's is a data-rich business, with information collected from consumers, suppliers and colleagues. We believe there is huge value in this data for driving value for our customers and, in turn, for our shareholders.
To date, some great work has been done that has started to realise this value, but there is significantly more opportunity brought about by developing the right information solutions.
This role will undertake data transformation and solution development, utilising appropriate models, processes, standards and principles to make high-quality, well-understood data available to drive analytics, reporting and insight across the business.
The role will involve working across a mix of traditional RDBMS, Big Data and new open source platforms with a range of tools and technologies.
What I need to do
Work in a Best Practice way
- Utilise the appropriate development, data engineering (ETL & ELT, stream processing), test and release model(s) to best fit the needs of the company
- Undertake a mixture of data ingestion and data transformation into target data models
- Plan and organise your time effectively to deliver tasks
- Have flexibility to undertake standard solution development as well as support rapid discovery, prototyping and data science, so that successful variants can be turned into production solutions
- Develop through agreed demand and delivery processes, principles and standards
- Use your knowledge and experience to address issues, provide solutions and suggest ways to further improve and enhance our delivery model
- Optimise code and suggest configuration changes to improve performance
- Convert algorithms, models and features created by data scientists and analysts from prototypes into production solutions
- Update and manage artefacts created by the team
- Deliver cost-effective, appropriate capabilities for the business
- Develop and utilise your network, within and beyond the open source community, to share and learn, develop new capabilities and enhance existing ones
- Work in a matrix manner with D&T (IT) and DACE colleagues to deliver end-to-end solutions in a joined-up way - clear communication and great teamwork are key
- Where appropriate, look to remove technical debt and deliver in repeatable, re-usable patterns
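The data engineering responsibilities above (ETL and ELT, ingesting raw data and transforming it into target models) could be sketched, purely illustratively, as a minimal batch ETL step in Python. All table, column and file contents here are hypothetical examples, not part of the role or of Sainsbury's actual systems:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: store sales exported as CSV (names illustrative only).
RAW_CSV = """store_id,sale_date,amount
S001,2023-01-05,12.50
S001,2023-01-06,8.00
S002,2023-01-05,20.25
"""

def extract(raw):
    """Extract: parse the raw CSV feed into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and conform rows to the target model (pence as integers)."""
    return [(r["store_id"], r["sale_date"], round(float(r["amount"]) * 100)) for r in rows]

def load(conn, records):
    """Load: write conformed records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (store_id TEXT, sale_date TEXT, amount_pence INTEGER)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount_pence) FROM sales").fetchone()[0]
print(total)  # 4075
```

In practice the same extract/transform/load shape would run on the platforms listed later (Spark, Hive, Kafka for streaming) rather than SQLite; this sketch only shows the pattern.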
Deliver great solutions
- Be a great team player working with others to create value rapidly, while delivering strategic long term solutions
- Effectively manage your own time and responsibilities to ensure deadlines are met and dependencies managed
- Be an expert in your craft and develop great optimised code
- Prioritise and manage workload to deliver against demand; complete tasks to achieve appropriate sprint burn down
- Choose the appropriate toolset from our existing capabilities for the task in question
- Work to deliver value to the business in an iterative manner through 2-3 week sprint windows
- Transform data from raw form into appropriate storage, information and presentation layers that enable analysts, data scientists and report writers to add value
- Look for ways to continuously improve what we do and how we work
- Focus your work on value creation from data across the enterprise
- Deliver change into our various data assets in a timely and cost-effective way
- Ensure solutions are appropriately documented and prepared for release and handover for operational support
- Develop a great relationship with D&T (IT), peers and the end user community
- Deliver great solutions
- Be inquisitive
- Focus on driving value back into the business
- Manage a complex mix of work within tight timescales and budgets, in an agile way that maximises our value opportunity (time to value)
- Be pro-active, suggest new approaches and drive continuous improvement
- Share what you are good at while learning from others to improve the team overall
- Be a subject matter expert of all traditional and Big Data capabilities and understand which solutions to use in what context for the best outcome.
- Strive to become a luminary in data development
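One bullet above describes transforming data from raw form into storage, information and presentation layers that analysts and report writers build on. As a purely illustrative sketch of that layering in Python (record and field names are hypothetical, not from any real Sainsbury's dataset):

```python
from collections import defaultdict

# Hypothetical raw layer: one record per transaction (illustrative names only).
raw_layer = [
    {"store": "S001", "category": "bakery", "spend": 3.20},
    {"store": "S001", "category": "produce", "spend": 5.10},
    {"store": "S002", "category": "bakery", "spend": 2.40},
]

# Information layer: conformed aggregates that analysts can query directly.
info_layer = defaultdict(float)
for rec in raw_layer:
    info_layer[(rec["store"], rec["category"])] += rec["spend"]

# Presentation layer: report-ready rows, sorted and rounded for consumption.
presentation = [
    {"store": s, "category": c, "total_spend": round(v, 2)}
    for (s, c), v in sorted(info_layer.items())
]
for row in presentation:
    print(row)
```

Each layer adds structure and meaning on top of the one below it, which is what lets data scientists and report writers add value without re-deriving the raw data every time.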
- A detailed knowledge of the tools, technologies, skills and processes required to deliver complex, complete development solutions for the business
- A detailed understanding of the development of data stores and data warehouses and their associated toolsets
- Detailed knowledge of data development in big data (Hadoop) and/or traditional data warehouse environments, using SQL or SQL-based ETL capabilities
- Our technology stack includes Cloudera, Spark, Hive, Impala, data formats, Kafka, Python, Scala, Teradata, Ab Initio, WhereScape, SQL Server etc.
- An understanding of how to do development in the cloud (AWS a plus), and the services available
- Experience of working in digital data teams using Adobe or related technologies a bonus
- A good understanding of Information Architecture and solution design
- A strong user and advocate of continuous integration (CI, Jenkins a plus), continuous deployment and test driven development
- A deep understanding and experience of using a mix of delivery methodologies including Agile, Scrum and Waterfall delivery
- A broad understanding of the applications of data from reporting through to data science
- Exemplary technical skills
- The passion, drive and commitment to succeed in a fast-moving, highly pressured environment
- Flexibility of approach
- A keenness to both learn and share your knowledge with others
- The ability to deliver quality solutions at an appropriate pace
- A proven track record of delivering solutions in a large scale, complex business
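The requirements above call for test-driven development alongside continuous integration. As a minimal, purely illustrative sketch of the test-first habit in Python (the function and values are hypothetical examples, not part of the role):

```python
# Test-first sketch: the test is written before the function it exercises,
# and a CI server (e.g. Jenkins) would run it on every commit.
def test_basket_total():
    assert basket_total([1.99, 0.80]) == 2.79
    assert basket_total([]) == 0.0

def basket_total(prices):
    """Sum a basket of item prices, rounded to whole pence."""
    return round(sum(prices), 2)

test_basket_total()
print("tests passed")
```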
- People - an appropriately sized and skilled team to work with
- Toolsets - a strong mix of more traditional and new big data, open source capabilities
- A wholly engaged business - keen to have 'data sorted' and equally keen to help
- The access to an appropriate community delivering in a matrix manner
- Personal development
- Career progression goals