I am a passionate software developer and data analytics enthusiast ready to elevate your next project. With expertise in algorithms, machine learning, and visualization, I bring a unique blend of technical skill and creative problem-solving to the table. Let's collaborate and turn your vision into reality!
Download CV
I conducted a comprehensive analysis of 10 years of data for over 50 products using Excel and Python. I analyzed Return on Ad Spend (ROAS) across marketing channels to gauge campaign effectiveness, and managed a significant volume of data across multiple products, maintaining over 100 Excel files. Excel served as a primary tool for data-driven decision-making, contributing to strategic insights.
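A ROAS calculation of this kind can be sketched in a few lines of pandas. The column names ("channel", "revenue", "ad_spend") are illustrative assumptions, not the actual schema used in the project:

```python
# Minimal sketch of a per-channel ROAS (Return on Ad Spend) calculation.
# Column names are hypothetical placeholders for the real campaign data.
import pandas as pd

def roas_by_channel(df: pd.DataFrame) -> pd.Series:
    """Revenue generated per unit of ad spend, grouped by marketing channel."""
    totals = df.groupby("channel")[["revenue", "ad_spend"]].sum()
    return totals["revenue"] / totals["ad_spend"]

campaigns = pd.DataFrame({
    "channel":  ["search", "search", "social"],
    "revenue":  [400.0, 200.0, 150.0],
    "ad_spend": [100.0, 100.0, 50.0],
})
print(roas_by_channel(campaigns))  # search: 3.0, social: 3.0
```

A ratio above 1.0 means a channel returned more revenue than it cost, which is the comparison that drives budget decisions across channels.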
I collected and integrated data from various systems to build a secure, centralized data warehouse, and identified key performance indicators (KPIs) for higher-education decision-making. I designed a comprehensive university administrative data model with over 1,500 entities using Lucidchart and SQL, and actively participated in project management meetings, contributing to strategic decisions.
I transformed legacy jQuery pages into custom reports using Python, object-oriented Java, and SQL, reducing load time by 10% and improving efficiency for over 100,000 enterprise customers. I optimized the production database, doubling report-generation speed by modifying over 25 SQL procedures. I also led test planning and bug reporting, collaborating with cross-functional teams to gather feedback and ensure project success.
I engineered a U-Net model achieving 95-97% accuracy in binary and multi-class aerial image segmentation using Python, and improved ArcGIS workflow efficiency by 30% with Python scripts. I applied data analysis and NLP techniques to process over 1,000 tweets per batch for space research using Twint and other scraping tools, and implemented real-time data streaming for immediate analysis of 10,000+ records from the GDELT event database.
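GDELT 2.0 publishes a `lastupdate.txt` listing that points at the newest 15-minute export batch, so near-real-time ingestion reduces to a polling loop. The sketch below shows that generic pattern, not the project's actual pipeline; the line format ("size md5 url") follows GDELT's published feed:

```python
# Hedged sketch of polling the GDELT 2.0 lastupdate feed for new event batches.
# Only illustrates the polling pattern; downloading/analysis is left as a stub.
import time
import urllib.request

LASTUPDATE_URL = "http://data.gdeltproject.org/gdeltv2/lastupdate.txt"

def parse_lastupdate(text: str) -> str:
    """Extract the URL of the newest export file from the lastupdate listing.

    Each line has the form "<size> <md5> <url>"; the first line points at
    the latest events export zip.
    """
    first_line = text.strip().splitlines()[0]
    return first_line.split()[-1]

def poll(interval_seconds: int = 900) -> None:
    """Check for a new export batch every 15 minutes (GDELT's update cadence)."""
    seen = None
    while True:
        with urllib.request.urlopen(LASTUPDATE_URL) as resp:
            url = parse_lastupdate(resp.read().decode())
        if url != seen:
            seen = url
            print("new batch:", url)  # download and analyze the batch here
        time.sleep(interval_seconds)
```

Tracking the last-seen URL keeps the loop idempotent: a batch is only processed once even though the feed is polled continuously.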
I used Selenium and Python to scrape data and integrate it with PostgreSQL. I developed a Python script to scrape real-estate listings for Indian properties from 20 websites, totaling over 5,000 records, and improved accuracy by using the Geocoding API to resolve precise locations. The integrated data, enriched with geolocations, was stored and managed in PostgreSQL, streamlining estate record management.
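The scrape → geocode → store pipeline can be sketched as below. Field names and the table schema are illustrative assumptions, and sqlite3 stands in for PostgreSQL so the example is self-contained (the real project used Selenium for scraping and PostgreSQL for storage):

```python
# Hedged sketch of normalizing scraped listings and storing them in a database.
# sqlite3 substitutes for PostgreSQL; schema and field names are hypothetical.
import sqlite3

def normalize_record(raw: dict) -> tuple:
    """Shape one scraped listing into a (name, address, lat, lon) row."""
    return (
        raw["name"].strip(),
        raw["address"].strip(),
        float(raw.get("lat", 0.0)),  # coordinates resolved via the Geocoding API
        float(raw.get("lon", 0.0)),
    )

def store(rows, conn) -> None:
    """Create the table if needed and bulk-insert normalized rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS estates "
        "(name TEXT, address TEXT, lat REAL, lon REAL)"
    )
    conn.executemany("INSERT INTO estates VALUES (?, ?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
store([normalize_record({"name": " Green Villa ", "address": "Pune, MH",
                         "lat": "18.52", "lon": "73.85"})], conn)
print(conn.execute("SELECT COUNT(*) FROM estates").fetchone()[0])  # → 1
```

Keeping normalization separate from storage makes each scraper's output testable on its own before any database writes happen.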