> Junior Data Scientist & Data Engineer
About Me
I began with a Bachelor's and Master's in Philosophy, then pivoted into data engineering at BeCode, where I build data pipelines and automate data workflows with modern tools and languages. My diverse background brings a creative edge to technical challenges.
Education
Master's in Philosophy
Istanbul MSGSU, Turkey (2020-2022)
Bachelor of Philosophy
Ankara YBU, Turkey (2013-2018)
Minors: Psychology, Sociology
Technical Skills
My technical expertise spans data engineering, data science, and cloud technologies, with hands-on experience building scalable data solutions.
Programming & Data
Data Engineering
Cloud & Tools
Data Science
Advanced Tools
Competencies
Professional Experience
My journey from philosophy to data engineering, enriched by diverse experiences in education, volunteering, and hands-on technical training.
Junior Data Engineer Intern
MinersAI
Currently working on advanced data engineering solutions for geological data processing and AI integration.
ETL Pipeline Development: Designing and optimizing pipelines that extract, transform, and load geological data
LLM Integration: Translation of unstructured geological text into structured formats
OCR & LLM for Raster Data: Data extraction from images/maps in geological PDFs
Feature Integration: Continuous workflow optimization and feature implementation
Geospatial & Vector Data Structuring: Structuring spatial/vector data for AI/ML analysis
Data Engineer Trainee
BeCode
Intensive data engineering bootcamp focusing on practical skills and hands-on projects.
Mastered data pipelines, ETL processes, and cloud platforms through active learning
Built end-to-end data solutions using modern tools and technologies
Developed expertise in Python, SQL, and data engineering frameworks
Collaborated on real-world projects fostering practical skills and continuous growth
Gained proficiency in cloud services (Azure, AWS) and containerization (Docker)
P4C Instructor/Mentor
Private Educational Institution
Teaching critical thinking through Philosophy for Children (P4C) methodology.
Designed and delivered philosophy curricula for students of various age groups
Developed critical thinking and analytical skills in students through Socratic dialogue
Created engaging educational materials and interactive learning experiences
Mentored students in developing logical reasoning and ethical thinking
Fostered inclusive classroom environments promoting intellectual curiosity
International Volunteer
European Solidarity Corps (ESC)
Animator organizing social inclusion and integration projects for diverse communities.
Organized and facilitated community integration events for international participants
Developed intercultural communication skills while working with diverse groups
Created educational workshops focused on social inclusion and cultural exchange
Coordinated logistics for large-scale community projects and events
Built strong networks within international volunteer communities
Featured Projects
A showcase of my data engineering and data science projects, demonstrating practical application of modern technologies and best practices.
Geological Data ETL Pipeline
Advanced ETL pipeline for processing and structuring geological data from multiple sources, integrating OCR and LLM technologies for unstructured data extraction.
Key Features:
- Automated geological data extraction from PDFs and images
- LLM integration for text-to-structured data conversion
- Scalable pipeline architecture with error handling
- Real-time data processing and validation
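As a rough illustration of the validation and loading stages listed above, a minimal extract-validate-load step might look like the sketch below. All names (`validate_record`, `run_pipeline`, the record fields) are hypothetical stand-ins, not taken from the actual MinersAI codebase.

```python
# Minimal sketch of an ETL validate/transform/load step for geological
# records. Purely illustrative; the real pipeline is far more involved.

def validate_record(record: dict) -> bool:
    """Accept a record only if it has an ID and plausible coordinates."""
    try:
        lat, lon = float(record["lat"]), float(record["lon"])
    except (KeyError, TypeError, ValueError):
        return False
    return bool(record.get("id")) and -90 <= lat <= 90 and -180 <= lon <= 180

def transform(record: dict) -> dict:
    """Normalize field names and round coordinates to a fixed precision."""
    return {
        "id": str(record["id"]),
        "lat": round(float(record["lat"]), 5),
        "lon": round(float(record["lon"]), 5),
    }

def run_pipeline(raw_records: list[dict]) -> tuple[list[dict], int]:
    """Validate and transform records; return loaded rows and a reject count."""
    loaded, rejected = [], 0
    for rec in raw_records:
        if validate_record(rec):
            loaded.append(transform(rec))
        else:
            rejected += 1
    return loaded, rejected
```

Keeping validation separate from transformation makes each stage independently testable, which is what the error handling bullet above is about in practice.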
Geospatial Data Processing System
Comprehensive system for processing and analyzing geospatial and vector data, optimized for AI/ML analysis workflows in geological applications.
Key Features:
- Advanced geospatial data structuring and cleaning
- Vector data optimization for machine learning models
- Interactive mapping and visualization capabilities
- Integration with AI/ML prediction models
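A tiny example of what "structuring vector data for ML" can mean at its simplest: reducing a geometry to fixed-size numeric features. The function below (a hypothetical helper, not from the project itself) derives a bounding box and centroid from a list of points.

```python
# Sketch: turn a variable-length list of (lon, lat) points into a
# fixed-size feature tuple usable by downstream ML models.

def point_features(points: list[tuple[float, float]]) -> dict:
    """Return bounding box and centroid for a list of (lon, lat) points."""
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    n = len(points)
    return {
        "bbox": (min(lons), min(lats), max(lons), max(lats)),
        "centroid": (sum(lons) / n, sum(lats) / n),
    }
```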
Automated Data Pipeline Framework
Robust framework for building and managing automated data pipelines with monitoring, logging, and scalable architecture using modern orchestration tools.
Key Features:
- Automated workflow orchestration with Airflow
- Containerized deployment with Docker
- Comprehensive monitoring and alerting system
- Modular and reusable pipeline components
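The framework itself uses Airflow for orchestration; as a self-contained sketch of the core idea (resolving task dependencies into a valid execution order), the standard library's `graphlib` is enough. The task graph below is a made-up example, not the framework's real DAG.

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the set of tasks it depends on.
# Airflow expresses the same relationships with operators and >> chaining.
tasks = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "report": {"load"},
}

def execution_order(graph: dict[str, set[str]]) -> list[str]:
    """Return one run order that respects every declared dependency."""
    return list(TopologicalSorter(graph).static_order())
```

Since the example graph is a simple chain, the only valid order is extract → transform → validate → load → report; a real DAG with parallel branches would admit several valid orders.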
Cloud Data Warehouse Solution
Scalable data warehouse implementation using Snowflake and DBT for efficient data transformation and analytics workflows.
Key Features:
- Optimized data modeling and transformation logic
- Automated testing and deployment processes
- Performance optimization and cost management
- Integration with business intelligence tools
Machine Learning Model Pipeline
End-to-end ML pipeline for data preprocessing, model training, evaluation, and deployment with automated retraining capabilities.
Key Features:
- Automated feature engineering and selection
- Model training with hyperparameter optimization
- Comprehensive model evaluation and validation
- Automated model deployment and monitoring
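The hyperparameter-optimization step above can be sketched as a plain grid search. Here a toy scoring function stands in for the expensive "train a model and return validation accuracy" call; both the grid and `toy_score` are illustrative, not the pipeline's real configuration.

```python
from itertools import product

def grid_search(param_grid: dict, score_fn) -> tuple[dict, float]:
    """Evaluate every parameter combination and keep the highest score."""
    names = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy stand-in for model training: prefers lr near 0.1 and shallow depth.
def toy_score(params: dict) -> float:
    return -abs(params["lr"] - 0.1) - 0.01 * params["depth"]

grid = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4]}
```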
Web Scraping & Data Collection Framework
Scalable web scraping framework for automated data collection from various sources with data quality checks and storage optimization.
Key Features:
- Multi-source data extraction capabilities
- Robust error handling and retry mechanisms
- Data quality validation and cleaning
- Efficient storage and indexing strategies
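The retry mechanism mentioned above typically amounts to exponential backoff around the fetch call. This sketch injects the fetcher as a parameter (so any HTTP client can be plugged in); `fetch_with_retry` and its defaults are hypothetical names for illustration.

```python
import time

def fetch_with_retry(fetch, url: str, retries: int = 3, base_delay: float = 0.01):
    """Call fetch(url), retrying with exponential backoff on failure.

    `fetch` is injected so the framework can swap in any HTTP client.
    Re-raises the last exception once all attempts are exhausted.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return fetch(url)
        except Exception as exc:
            last_exc = exc
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ...
    raise last_exc
```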
Interested in Collaboration?
I'm always open to discussing new projects and opportunities in data engineering and data science.
Get In Touch
I'm currently seeking opportunities as a Junior Data Engineer. Let's discuss how we can work together to build innovative data solutions.
Send a Message
Ready to Start a Conversation?
Whether you have a project in mind, want to discuss opportunities, or just want to connect, I'd love to hear from you.