BI and Reporting Developer

This position will apply subject matter expertise to source, analyse, and validate data from internal and external sources as part of the Business Intelligence & Data Warehousing project. The role is responsible for modelling, structuring, and aggregating data in an efficient, accurate, and customer-relevant manner, using the following technologies: BIRST, AWS, and Microsoft Azure. Extensive knowledge of designing and implementing dashboards using BIRST and Looker is mandatory.

Education and Skills

  • Bachelor's degree or higher in an Engineering, IT, Math, or Science-related field.

  • 6+ years of experience, including a minimum of:
      • 3+ years in SQL / AWS Redshift
      • 2+ years of experience building dashboards using tools such as BIRST and Looker

  • Hands-on data warehouse practitioner, with a broad understanding of big data ecosystems, distributed computing, and analytics in a public cloud environment (nice to have)

  • Ability to rapidly build ETL/ELT data processing jobs using a combination of SQL and programming/scripting languages (e.g. Python)
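
As a minimal illustration of the kind of ETL/ELT job described above, the sketch below extracts rows from a CSV source, transforms them in Python, and loads them into a SQL table. All table, column, and variable names are hypothetical, and an in-memory SQLite database stands in for a warehouse such as Redshift:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed from an external source.
RAW_CSV = """order_id,amount,region
1,19.99,EU
2,5.00,US
3,42.50,EU
"""

def run_etl(raw_csv: str) -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    # Extract + transform: parse the CSV and cast fields to proper types.
    rows = [
        (int(r["order_id"]), float(r["amount"]), r["region"])
        for r in csv.DictReader(io.StringIO(raw_csv))
    ]
    # Load: bulk insert into the target table.
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn

conn = run_etl(RAW_CSV)
total_eu = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'EU'"
).fetchone()[0]
```

In a production pipeline the same extract/transform/load split would typically target the warehouse's bulk-load path (for Redshift, `COPY` from S3) rather than row-by-row inserts.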

Core Competencies

  • Strong proficiency with SQL and its dialects (e.g. PostgreSQL) across popular databases
  • Strong database fundamentals including SQL performance and schema design
  • Knowledge of best practices for cloud databases such as AWS Redshift and Azure
  • Capable of troubleshooting common database issues
  • Translating functional and technical requirements into detailed design
  • Coding and testing complex system components
  • Code and design reviews to maintain high development standards
  • Data Analysis experience
  • Strong skills in performance tuning of complex SQL queries, procedures, and indexing strategies
  • Extensive knowledge of designing and implementing dashboards using BIRST and Looker

  • Design OLAP databases using data warehouse patterns and schemas, including facts, dimensions, sort keys, indexes, constraints, etc.
  • Query design and performance tuning of complex queries for very large data sets
  • Working knowledge of the AWS console and AWS services, including S3 storage
  • Knowledge of AWS data pipelines for developing ETL to move data into Redshift
  • Programming skills in Java, Python or similar
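
The fact/dimension pattern listed above can be sketched as a minimal star schema: one fact table referencing a dimension table, joined and aggregated in a typical reporting query. All names are illustrative; an in-memory SQLite database stands in for the OLAP warehouse, so Redshift-specific features such as sort keys and distribution styles are omitted:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240101
    year     INTEGER NOT NULL,
    month    INTEGER NOT NULL
);

CREATE TABLE fact_sales (
    date_key INTEGER NOT NULL REFERENCES dim_date (date_key),
    amount   REAL    NOT NULL
);
-- Index the join key; in Redshift this role is played by sort/dist keys.
CREATE INDEX idx_fact_sales_date ON fact_sales (date_key);

INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0),
                              (20240201, 25.0);
""")

# Typical star-schema query: join the fact table to a dimension and aggregate.
monthly = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.month ORDER BY d.month
""").fetchall()
```

The design choice worth noting is that measures live only in the narrow fact table, while descriptive attributes live in dimensions, which keeps large scans cheap and aggregation queries simple.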

Roles & Responsibilities

  • Identify and understand business needs to effectively design, develop, test, and deploy reports or applications
  • Stress test and optimize the system for efficiency in a distributed database
  • Build custom data imports as required
  • Demonstrate SQL best practices with regard to security and privacy; design and create SQL views, Common Table Expressions (CTEs), and stored procedures to combine and aggregate data for reporting purposes
  • Be responsive and timely with ad-hoc data requests
  • Be an engaged and productive member of the development team
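
One way to sketch the views-and-CTEs pattern from the responsibilities above: a reporting view built on a CTE that aggregates per region. The schema and all names are hypothetical, and an in-memory SQLite database again stands in for the warehouse (SQLite has no stored procedures, so only the view/CTE part is shown):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('EU', 10.0), ('EU', 15.0), ('US', 7.5);

-- A reporting view whose body uses a CTE to aggregate per region,
-- so consumers query a stable name instead of repeating the logic.
CREATE VIEW region_totals AS
WITH totals AS (
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS n_orders
    FROM sales
    GROUP BY region
)
SELECT region, total_amount, n_orders FROM totals;
""")

report = conn.execute(
    "SELECT region, total_amount FROM region_totals ORDER BY region"
).fetchall()
```

Exposing aggregations through views like this also supports the security point: reporting users can be granted access to the view without seeing the underlying row-level table.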

Non-Technical

  • Excellent interpersonal, analytical, and problem-solving skills
  • Excellent verbal and written communication skills
  • Ability to learn quickly and ‘get things done’
  • Ability to multi-task in a fast-paced environment
  • Motivated to share knowledge in a team environment
  • Self-directed and able to prioritize own workload
  • Mentor junior team members as needed
  • Work with data architects to ensure that solutions are aligned with company-wide technology directions

How to apply

To apply for this role, please email us at '' with your CV and the name of the position you are applying for, and we will get back to you as soon as possible.