Building Responsible AI Across India and France

Building technology across geographies is often described in terms of market expansion or global reach. For us, expanding from India into the French Tech ecosystem was neither a scaling exercise nor a relocation of operations. It was an intellectual broadening—one that deepened, rather than replaced, what we had already learned in India.
India and France offered distinct, complementary ways of thinking about technology, intelligence, and responsibility. Working across both contexts reshaped how we approach research, product design, and the role of AI in society.
Grounded in India: Scale, Context, and Lived Complexity
India was where our foundational thinking took shape. It is a context defined by scale, not just in population or data volume but in diversity of language, environment, infrastructure, and lived experience. Problems rarely present themselves in clean, well-bounded forms. They are layered, interdependent, and shaped by local realities that resist abstraction.
Building in India forces early confrontation with complexity. Systems must operate under uncertainty, adapt to uneven access, and remain legible across wide social and cultural differences. This environment sharpened our understanding that intelligence systems cannot be designed as neutral utilities. They inevitably interact with human behaviour, institutional constraints, and social trust.
India grounded us in the importance of context as a first-order concern. It taught us that relevance often matters more than precision, and that scale amplifies both strengths and weaknesses in system design.
France: Research Rigour and Institutional Perspective
Expanding into France did not dilute these lessons. Instead, it deepened them. The French Tech ecosystem exposed us to a different orientation toward technology, one rooted in research depth, institutional engagement, and long-term thinking. There is a strong tradition of linking technological development with public systems, policy frameworks, and academic inquiry. This creates an environment where questions of responsibility, governance, and societal impact are not peripheral but central.
France strengthened our engagement with research rigour. It reinforced the value of theory, methodology, and interdisciplinary dialogue, particularly between technical disciplines and the humanities. This perspective is essential when building systems that operate in public-facing domains such as health, environment, and culture.
Where India demanded adaptability and pragmatism, France encouraged deliberation and structural thinking. Together, these approaches created a more balanced foundation for our work.
Establishing Ourselves in the French Tech Ecosystem
Our formal establishment within the French Tech ecosystem in 2021 was a critical inflection point in this journey. Entering the ecosystem provided not only geographic expansion but also access to institutional support structures designed to enable long-term, research-led innovation.
Support from initiatives such as the Bourse French Tech played an important role in stabilising early experimentation and allowing us to invest in foundational research rather than short-term productisation. Engagements through platforms like France Digitale Day connected us to a broader community of founders, researchers, and policymakers grappling with similar questions around responsible and public-interest technology.
Participation in international forums such as the High Level Forum (HLF) and the International Innovation Ecosystems Network further expanded our perspective. These spaces emphasised cross-border collaboration, public-private dialogue, and the importance of aligning technological innovation with societal priorities. Collectively, this ecosystem reinforced the idea that meaningful AI development benefits from institutional continuity and shared responsibility.
Expanding the Field of Possibility
Working across India and France also broadened our outlook on where AI can and should be applied. Exposure to cultural institutions, public research bodies, and civic technology initiatives in France opened our thinking to culture tech as a serious domain, one for preserving meaning, enabling access, and supporting collective sense-making rather than mere entertainment or digitisation.
At the same time, the combination of India’s lived challenges and France’s public-systems perspective reinforced the importance of applying AI toward the larger good, particularly in health tech and climate tech. These are domains where intelligence systems influence behaviour, risk perception, and long-term resilience.
Across both contexts, one insight became clear:
Technical capability without ethical orientation and contextual grounding is insufficient. AI must be designed with an explicit awareness of who it serves, how it shapes decisions, and what futures it makes more likely.
Toward a Shared Practice
Building across India and France was not about choosing between two models. It was about learning to hold both scale and rigour, context and structure, lived complexity and long-term stewardship.
This synthesis informs the work we are now doing at Bhaskar Labs: a space dedicated to research, incubation, and public engagement around contextual and responsible intelligence. Bhaskar Labs exists precisely because the challenges we are addressing cannot be solved within a single disciplinary or geographic frame.
The questions we are working on around culture, health, climate, and public intelligence require collaboration across borders, disciplines, and institutions. Bhaskar Labs is being built as a place for that collaboration: for researchers, technologists, designers, policy thinkers, and practitioners who are interested in shaping AI that is rigorous, humane, and socially accountable.
If you are working at the intersection of technology, culture, health, or climate, and are interested in contributing to research, incubation, or dialogue, we invite you to engage with us.
The future of intelligence will be shaped not only by where we build, but by how, and with whom, we choose to build it.
