Portfolio
Nexus
Role: AI Product, Lead Engineer
Nexus, initially deployed as a WhatsApp agent for "Talk to the City," is a full-stack, AI-powered tool designed to enhance collective discourse and decision-making. By leveraging Large Language Models (LLMs), Nexus facilitates the collection and structuring of qualitative responses, providing policymakers with actionable insights, particularly in underrepresented and low-connectivity communities.
AI-Powered WhatsApp Agent Development: Led the creation of a multi-modal WhatsApp agent using Twilio Business API, OpenAI's Assistant API with GPT-4, and Whisper API, focusing on both text and audio interactions for "Talk to the City."
Heroku Deployment & Optimization: Managed complex deployment strategies on Heroku, optimizing resources, enhancing performance with AWS CloudFront, and implementing advanced security measures to handle peak loads and maintain cost efficiency.
Cost Analysis & Scalability Management: Conducted in-depth cost analysis and maintained scaling strategies to handle API rate limits and deployment challenges, ensuring seamless performance and high availability.
Customized Solutions for Conferences: Developed tailored WhatsApp agents for high-profile conferences, showcasing advanced AI functionalities and ensuring high performance during live demos.
Full-Stack Development & Data Integration: Managed the full lifecycle development, including the implementation of a REST API system and a data aggregation solution using AWS Lambda and Firebase, automating data processing for clear and organized analysis.
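The multi-modal routing described above, deciding whether an inbound WhatsApp message should pass through speech-to-text before reaching the language model, can be sketched roughly as follows. The webhook field names (`NumMedia`, `MediaContentType0`, `Body`) follow Twilio's documented parameters, but the routing logic itself is an illustrative simplification, not the actual Nexus code:

```python
def route_incoming(message: dict) -> str:
    """Decide how to process an inbound WhatsApp webhook payload.

    Twilio marks media attachments with NumMedia/MediaContentType0
    fields; audio is transcribed first (e.g. via Whisper), plain
    text goes straight to the assistant. A simplified sketch.
    """
    num_media = int(message.get("NumMedia", "0"))
    if num_media > 0:
        content_type = message.get("MediaContentType0", "")
        if content_type.startswith("audio/"):
            return "transcribe_then_respond"
        return "unsupported_media"
    if message.get("Body", "").strip():
        return "respond"
    return "ignore"
```

A text message routes to `"respond"`, while a WhatsApp voice note (typically `audio/ogg`) routes to `"transcribe_then_respond"`.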
Talk To the City
Role: Cloud Engineer
"Talk to the City" is an innovative AI-driven tool designed to enhance collective decision-making by analyzing qualitative data at scale. It empowers policymakers and peace mediators to uncover nuanced opinions and disagreements, providing automatic summaries, visualizations, and reports that improve the transparency and inclusivity of democratic processes.
Cloud Deployment & Docker Integration: Successfully deployed the application on Heroku, integrating AWS CloudFront and Docker to optimize global content delivery, enhance security, and solve complex engineering challenges for reliable performance.
AWS & Heroku Challenges: Worked with AWS's internal team to resolve a critical error, advancing project deployment capabilities by 80%, and managed complex deployment scenarios, including building customized Lambda layers to extend API functionality.
Pipeline Deployment & Bug Resolution: Deployed the project pipeline using Docker, resolving critical bugs and enhancing system reliability through key repository modifications.
Engineer Guidelines: Developed comprehensive deployment guidelines and walkthroughs, optimizing the process for future engineers and ensuring smooth handling of complex deployment steps.
Moral Learning
Role: DevOps Engineer
The "Moral Learning" project aims to align AI systems with human moral virtues by developing methods to represent and reason about moral values within large language models. By integrating psychological research and probabilistic reasoning, the project seeks to create AI that can transparently understand and complement human ethical frameworks.
Enhanced Data Pipeline: Streamlined and optimized the automated data pipeline, improving the efficiency of adding and modifying Qualtrics survey templates for the "Moral Learning" project.
API System Development: Designed and deployed a RESTful API that overcame Prolific's limitations, enabling the distribution of multiple surveys per study, crucial for complex research setups.
Survey Distribution Success: Distributed surveys to 100 participants swiftly using a cost-effective API system, ensuring seamless execution and data collection.
Advanced Data Analysis: Developed and implemented a comprehensive analysis pipeline for Qualtrics survey outputs, utilizing R and ggplot to generate detailed visualizations and insights on annotator performance.
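Since a Prolific study points at a single external URL, one way to run several surveys in one study is a thin redirect layer that hands each participant a different Qualtrics link. The assignment step of that idea can be sketched as a simple round-robin; all names here are illustrative, and the actual Prolific/Qualtrics API integration is omitted:

```python
from itertools import cycle

def assign_surveys(participant_ids, survey_urls):
    """Map each participant to one survey URL, round-robin.

    Cycling through the URL list spreads participants evenly
    across surveys regardless of how many sign up. Sketch of
    the assignment step only.
    """
    urls = cycle(survey_urls)
    return {pid: next(urls) for pid in participant_ids}
```

With two survey links and three participants, the third participant wraps back to the first link.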
Generative Alpha
Role: AI Developer & Data Engineer
G-Alpha aims to transform financial research by developing an AI agent that simulates human roles in trading and financial advisory. Leveraging advanced AI models and data engineering, this scalable system of AI analysts enhances research with unmatched speed and precision.
AI Agent Development: Contributed to the creation of an AI agent that simulates human roles in quant trading, fundamental trading, and financial advising, using Python, MongoDB, GPT-4, and LangChain. This agent is designed to operate as a remote worker on Wall Street, handling complex financial tasks autonomously.
Data Ingestion Pipeline: Engineered a sophisticated data ingestion pipeline that utilizes the SEC API to extract, classify, and curate a decade's worth of SEC filings, including earnings calls, 10-Ks, 10-Qs, and 8-Ks for US30 stocks. Implemented advanced data preprocessing, normalization, and transformation techniques to produce high-quality datasets for AI training.
LLM Fine-Tuning: Utilized GPT-4 to generate Q&A pairs from the curated datasets and fine-tuned both GPT-4 and Llama 2 models to deepen their understanding of financial data and complex financial text. This fine-tuning was complemented by advanced prompt engineering techniques, significantly improving the AI's analytical capabilities.
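Curating a decade of filings starts with bucketing each document by its SEC form type. The form-type strings below follow SEC conventions (including the `/A` amendment suffix); the function itself is an illustrative sketch, not the actual ingestion pipeline:

```python
def classify_filing(form_type: str) -> str:
    """Bucket an SEC form type into a coarse category for curation.

    10-K: annual report; 10-Q: quarterly report; 8-K: current
    report. Amended filings carry a '/A' suffix (e.g. '10-K/A')
    and are grouped with their base form.
    """
    base = form_type.upper().strip().removesuffix("/A")
    return {
        "10-K": "annual_report",
        "10-Q": "quarterly_report",
        "8-K": "current_report",
    }.get(base, "other")
```

Anything outside the three targeted forms falls into `"other"` and can be filtered out before preprocessing.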
Explomind
Role: Data Scientist - Trading Algorithms
Explomind focuses on pioneering advanced day trading strategies and algorithmic solutions, leveraging state-of-the-art machine learning models and full-stack development to enhance trading platforms with high efficiency and precision.
Adaptive Trading Strategies: Developed and validated day trading strategies using TensorFlow and Scipy, incorporating NLP-based fundamental analysis, hedge fund shadow trading, and web scraping with Selenium and Beautiful Soup.
Full-Stack Development: Led the architecture and deployment of backend and frontend components using Python, Dash, PostgreSQL, and AWS, ensuring optimal platform performance and scalability.
Model Fine-Tuning: Fine-tuned a GPT-4 model for trading insights, designed ETL pipelines, and validated strategies with TensorFlow and Scipy, focusing on scalability and availability using AWS tools.
Custom Indicators: Created platform-independent custom indicators with Python and Scipy, visualized through Plotly and Seaborn, transforming unstructured intraday data for optimal algorithmic execution.
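One classic example of a platform-independent intraday indicator is a running VWAP (volume-weighted average price). The stdlib sketch below illustrates the shape of such an indicator; it is a generic example, not one of the actual Explomind indicators:

```python
def running_vwap(bars):
    """Cumulative VWAP over intraday (price, volume) bars.

    Returns one value per bar: cumulative dollar volume divided
    by cumulative traded volume up to and including that bar.
    """
    vwap, pv_sum, vol_sum = [], 0.0, 0.0
    for price, volume in bars:
        pv_sum += price * volume
        vol_sum += volume
        vwap.append(pv_sum / vol_sum if vol_sum else float("nan"))
    return vwap
```

Because it only consumes `(price, volume)` pairs, the same function works against any data feed once the raw intraday stream has been normalized.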
Explomind
Role: Data Science Systems Architect
At Explomind, I architected the development of an adaptive reinforcement learning system, focusing on a human-in-the-loop approach to enhance AI model performance. I led a global team, designed comprehensive training programs, and implemented robust systems for effective decision-making and performance tracking.
Global Team Leadership: Recruited and led a diverse international team, mentoring and training them in adaptive reinforcement learning and human-in-the-loop systems. I designed and managed the entire hiring process, from creating job flyers to headhunting candidates.
Comprehensive Training & Guidance: Developed extensive teaching courses and guidelines to ensure the team was well-equipped to execute tasks effectively. I was responsible for overseeing the team's progress, providing continuous support and feedback.
Platform Design: Engineered a platform for executing human-in-the-loop decisions, including the design of pipelines that label AI agents' decisions, ensuring seamless integration and accuracy.
Emergency Protocols & Performance Tools: Created emergency situation alternatives and developed automated performance tracking systems using AWS SES and Lambda, ensuring reliability and efficiency in the project's execution.
Copy Trading System: Designed a copy trading system from scratch using R, MT5, MQL5, and Expert Advisors (EAs) running on a VPS, and developed a full-stack trading platform for financial NLP and pattern recognition, enabling automated feedback for all team members.
Sentium
Role: Quantitative Analyst
At Sentium, I specialized in developing advanced tools and strategies for high-frequency trading, focusing on the optimization of data management and algorithmic execution to enhance trading accuracy and efficiency.
Custom Indicator Development: Engineered custom indicators using Python, Scipy, Plotly, and Seaborn, optimizing algorithmic execution by managing millions of data points in TimescaleDB, sourced from the polygon.io REST API.
Mathematical Spike Detection: Designed a custom mathematical solution for precise spike detection, significantly improving the accuracy of high-frequency trading strategies and enhancing overall trading performance.
AWS Deployment Challenges: Overcame deployment challenges on AWS Lambda by managing Lambda layers with Docker and Cloud9 and optimizing library footprints to fit within Lambda's package size limits.
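A common mathematical approach to spike detection is flagging points whose deviation from a rolling baseline exceeds a z-score threshold. The stdlib sketch below illustrates that general idea; it is not the proprietary Sentium solution, and the window and threshold values are illustrative defaults:

```python
from statistics import mean, stdev

def detect_spikes(series, window=20, threshold=3.0):
    """Return indices where a value deviates from the trailing
    window's mean by more than `threshold` sample standard
    deviations. Constant windows (zero stdev) are skipped.
    """
    spikes = []
    for i in range(window, len(series)):
        ref = series[i - window:i]
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            spikes.append(i)
    return spikes
```

Because the window trails the current point, the spike itself does not inflate the baseline statistics used to test it.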
Info Investment
Role: Quantitative Analyst Intern
At INFO Investment, I focused on the development and rigorous testing of algorithmic trading strategies, applying advanced techniques to ensure robust performance and risk management in the Turkish stock market.
Algorithmic Strategy Development: Developed and backtested trading strategies for Turkish stocks using C# and Python. Employed the Monte Carlo method to validate strategies across various market conditions, accounting for execution costs, including commissions and slippage.
Performance Optimization: Enhanced strategy performance by optimizing key metrics such as Sharpe Ratio and Maximum Drawdown. Implemented comprehensive risk management techniques, including position sizing, stop-loss, and diversification, with a strong focus on liquidity in real-time markets.
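The validation approach above, checking Sharpe ratio and maximum drawdown over Monte Carlo resampled paths, can be sketched as follows. Returns are assumed to be already net of commissions and slippage, and all parameters are illustrative rather than the actual backtest configuration:

```python
import random

def sharpe(returns):
    """Per-period Sharpe ratio: mean return over sample
    standard deviation, with the risk-free rate taken as zero."""
    n = len(returns)
    mu = sum(returns) / n
    var = sum((r - mu) ** 2 for r in returns) / (n - 1)
    return mu / var ** 0.5

def max_drawdown(returns):
    """Largest peak-to-trough drop of the compounded equity curve,
    expressed as a fraction of the peak."""
    equity, peak, worst = 1.0, 1.0, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        worst = max(worst, (peak - equity) / peak)
    return worst

def monte_carlo_drawdowns(returns, trials=1000, seed=42):
    """Bootstrap-resample return paths and collect their max
    drawdowns, a simple way to gauge drawdown risk across
    simulated market conditions."""
    rng = random.Random(seed)
    return [max_drawdown(rng.choices(returns, k=len(returns)))
            for _ in range(trials)]
```

Comparing the strategy's realized drawdown against the resampled distribution gives a rough sense of whether the backtest result was luck or robustness.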
Kids of Hope
Kids of Hope was born out of determination when UNICEF considered me too young to lead such an initiative. I founded and led a sports camp for 400 Syrian refugee children in Kilis, Southeastern Turkey. Every weekend during high school, I flew to Kilis to ensure the success of this camp, which focused on fostering mental and physical well-being through the joy of sports and teamwork. Witnessing the smiles of these children as they laced up their new, colorful football shoes was the ultimate reward.
Moments of Joy, Bonds of Unity
Knowledge & Contact
Sharing insights in data science, AI, and beyond. Follow along for updates, projects, and more.
Get in touch
Stay Connected
emre@turaninsight.com
© 2024. All rights reserved.