
# 🕉️ DeepShiva - AI Travel Companion for Indian Tourism & Wellness

*Your intelligent guide to India's spiritual and cultural heritage*

## 🌟 Overview

DeepShiva is a specialized AI model designed to bridge the gap between modern travelers and India's rich spiritual traditions. Built on the foundation of Llama 8B, it serves as a personal companion for exploring Indian tourism, wellness practices, yoga, Ayurveda, and ancient wisdom.

Much of this traditional knowledge is fragmented across sources and hard for modern travelers to access in context. DeepShiva addresses these challenges by providing culturally informed, spiritually aware assistance that respects and preserves traditional knowledge while making it accessible to modern practitioners.

## 🔧 Technical Specifications

- **Base Model:** Llama 8B (8.03B parameters)
- **Fine-tuning Method:** QLoRA (Quantized Low-Rank Adaptation)
- **Training Type:** Unsupervised fine-tuning
- **Architecture:** Transformer-based, with specialized Indian cultural knowledge
- **Hardware:** Trained on an AMD MI300 GPU
- **Model Size:** 8.03B parameters
- **Quantization:** 4-bit optimization for efficient deployment

## 📚 Training Datasets

The model was fine-tuned on carefully curated datasets focusing on Indian spiritual and cultural knowledge:

1. **Sanskrit-llama** (`diabolic6045/Sanskrit-llama`)
   - Ancient Sanskrit texts and translations
   - Foundational spiritual literature
2. **Bhagavad Gita** (`OEvortex/Bhagavad_Gita`)
   - Complete Bhagavad Gita with commentary
   - Philosophical discussions and interpretations
3. **Ramayana** (`Naman0001/Ramayan`)
   - Epic narratives and moral teachings
   - Cultural values and traditional stories

These datasets give the model a deep understanding of:

- Sanskrit terminology and concepts
- Hindu philosophy and spirituality
- Traditional Indian values and practices
- Cultural context for modern applications

## 🎮 Try the Model

Experience DeepShiva through our interactive web interface:

- **Live Demo:** Try our fine-tuned model
- **Hugging Face Space:** Available for direct model interaction
- **API Access:** Available through the Hugging Face Inference API
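Access through the Inference API might look like the following sketch with `huggingface_hub`. The helper name `ask_deepshiva` is hypothetical, and the call will only succeed once an inference provider actually serves this model.

```python
from huggingface_hub import InferenceClient

# Point the client at the model repository on the Hub.
client = InferenceClient(model="Riddhish121/Indian-cluture-deepshiva")

def ask_deepshiva(prompt: str, max_new_tokens: int = 200) -> str:
    # text_generation sends a single-prompt completion request
    # to the hosted Inference API.
    return client.text_generation(prompt, max_new_tokens=max_new_tokens)
```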

๐Ÿƒโ€โ™‚๏ธ Quick Start

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the model and tokenizer
model_name = "Riddhish121/Indian-cluture-deepshiva"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Example usage
prompt = "Guide me through a traditional yoga practice for beginners"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=200,   # cap the number of newly generated tokens
    do_sample=True,       # sampling must be on for temperature to apply
    temperature=0.7,
)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```

๐ŸŒ Use Cases

- **Spiritual Tourism:** Plan authentic spiritual journeys across India
- **Wellness Coaching:** Get personalized Ayurvedic and wellness advice
- **Yoga Practice:** Receive guidance on traditional yoga and meditation
- **Cultural Education:** Learn about Indian philosophy and traditions
- **Travel Planning:** Discover cultural and spiritual destinations
- **Personal Growth:** Integrate ancient wisdom into modern life

## 📊 Model Performance

- **Specialized Knowledge:** Optimized for Indian cultural and spiritual content
- **Contextual Understanding:** Deep comprehension of Sanskrit terms and concepts
- **Cultural Sensitivity:** Respectful representation of traditional practices
- **Practical Guidance:** Actionable advice for modern practitioners

## 🔄 Model Updates

- **Version:** Latest stable release
- **Last Updated:** [Current Date]
- **Downloads:** 6 in the last month
- **Community:** Growing user base of spiritual seekers and cultural enthusiasts

๐Ÿค Community & Support

Join our community of practitioners and developers:

- Share your experiences with DeepShiva
- Contribute to model improvement
- Request new features and capabilities
- Connect with like-minded spiritual seekers

## 📄 License

This model is released under the MIT License, promoting open access to spiritual and cultural knowledge while respecting traditional wisdom.

๐Ÿ™ Acknowledgments

We express our gratitude to:

- The creators of the Sanskrit, Bhagavad Gita, and Ramayana datasets
- The open-source community for foundational tools
- Traditional teachers and spiritual practitioners who preserve this knowledge
- AMD for providing MI300 GPU resources for training

*DeepShiva - Where Ancient Wisdom Meets Modern AI* 🕉️
