Microsoft introduces Phi-3-mini AI model, cost-effective and versatile
Ars Technica — April 23, 2024, 10:00 PM UTC
Summary: Microsoft introduced Phi-3-mini, a lightweight AI language model with 3.8 billion parameters, making it simpler and cheaper to run than large models like GPT-4. Unlike models that require data-center GPUs, Phi-3-mini can run on consumer GPUs and smartphones. Microsoft plans to release larger versions of Phi-3. Phi-3-mini's performance reportedly rivals Mixtral 8x7B and GPT-3.5. It is available on Azure, Hugging Face, and Ollama for local use. Phi-3's innovation lies in its curated training data and its alignment for robustness and safety.
Article metrics
Significance: 5.3
Scale: 7.0
Magnitude: 6.5
Potential: 7.5
Novelty: 8.0
Actionability: 6.0
Immediacy: 9.0
Positivity: 7.0
Credibility: 7.5