<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>FAQ - GGUF Loader</title>
<link rel="stylesheet" href="styles.css">
<link rel="stylesheet" href="mobile-fixes.css">
</head>
<body>
<header>
<nav>
<a href="index.html" class="logo">GGUF Loader</a>
<ul>
<li><a href="index.html#features">Features</a></li>
<li><a href="guides.html">Guides</a></li>
<li><a href="faq.html">FAQ</a></li>
<li><a href="need-help.html">Need Help?</a></li>
</ul>
</nav>
</header>
<main class="container">
<h1>Frequently Asked Questions</h1>
<section id="faq" class="faq">
<div class="faq-grid">
<article class="faq-item">
<h3>What is GGUF?</h3>
<p>GGUF (GPT-Generated Unified Format) is an optimized model format created for llama.cpp to
enable fast local inference of large language models.</p>
<div class="faq-links">
<a href="what-is-gguf.html" class="more-info-btn"
aria-label="Complete guide to GGUF format">
                            📖 Complete GGUF Guide
</a>
<a href="docs/package-structure/" class="more-info-btn"
aria-label="Technical details about GGUF format">
                            🔧 Technical Details
</a>
</div>
</article>
<article class="faq-item">
<h3>Do I need Python or CLI knowledge?</h3>
<p>No, you don't need Python or command line knowledge. GGUF Loader provides a user-friendly
graphical interface.</p>
<div class="faq-links">
<a href="docs/installation/" class="more-info-btn"
aria-label="Simple installation without Python">
                            📦 Installation Guide
</a>
<a href="docs/quick-start/" class="more-info-btn" aria-label="User-friendly tutorial">
                            🚀 Quick Start
</a>
</div>
</article>
<article class="faq-item">
<h3>Can I run AI models completely offline?</h3>
<p>Yes, GGUF models can run entirely offline on your local machine without requiring internet
connectivity.</p>
<div class="faq-links">
<a href="docs/installation/" class="more-info-btn" aria-label="Offline installation guide">
                            🌐 Offline Setup
</a>
<a href="docs/quick-start/#test-basic-chat" class="more-info-btn"
aria-label="Start using offline AI">
                            💬 Start Chatting
</a>
</div>
</article>
<article class="faq-item">
<h3>What are the system requirements?</h3>
<p>GGUF models can run on standard hardware. Smaller models work on systems with 8GB RAM, while
larger models may require 16GB or more.</p>
<div class="faq-links">
<a href="gguf-memory-calculator.html" class="more-info-btn"
aria-label="RAM calculator for GGUF models">
                            🧮 Memory Calculator
</a>
<a href="docs/installation/#gpu-acceleration-optional" class="more-info-btn"
aria-label="GPU acceleration setup">
                            ⚡ GPU Setup
</a>
</div>
</article>
<article class="faq-item">
<h3>How do I get started?</h3>
                    <p>Getting started is simple: install GGUF Loader, download a GGUF model, load the model in the
                        app, and start chatting with AI locally.</p>
<div class="faq-links">
<a href="how-to-run-gguf-models.html" class="more-info-btn"
aria-label="Step-by-step guide to run GGUF models">
                            🎯 How to Run GGUF
</a>
<a href="docs/installation/" class="more-info-btn" aria-label="Installation instructions">
                            📦 Installation
</a>
</div>
</article>
<article class="faq-item">
<h3>Can I create custom addons?</h3>
<p>Yes! GGUF Loader has a powerful addon system that lets you create custom functionality with
Python. Build your own AI tools and integrations.</p>
<div class="faq-links">
<a href="docs/addon-development/" class="more-info-btn"
aria-label="Learn to develop addons">
                            🛠️ Development Guide
</a>
<a href="docs/addon-api/" class="more-info-btn" aria-label="Complete API reference">
                            📚 API Reference
</a>
</div>
</article>
<article class="faq-item">
<h3>What is the Smart Floating Assistant?</h3>
                    <p>The Smart Floating Assistant is a system-wide feature that lets you process text with AI
                        in any application: select text anywhere and get instant AI assistance.</p>
<div class="faq-links">
<a href="docs/smart-floater-example/" class="more-info-btn"
aria-label="Learn about Smart Floating Assistant">
                            ✨ Smart Assistant Guide
</a>
<a href="docs/quick-start/#use-the-smart-floating-assistant" class="more-info-btn"
aria-label="Quick start with floating assistant">
                            🚀 Get Started
</a>
</div>
</article>
<article class="faq-item">
<h3>Where can I find GGUF models?</h3>
<p>You can download GGUF models from Hugging Face (especially TheBloke's optimized models), or
convert your own models to GGUF format.</p>
<div class="faq-links">
<a href="download-gguf-models.html" class="more-info-btn"
aria-label="Download GGUF models guide">
                            📥 Download Guide
</a>
<a href="2025-07-07-top-10-gguf-models-i5-16gb.html" class="more-info-btn"
aria-label="Best GGUF models for your system">
                            🏆 Best Models
</a>
</div>
</article>
</div>
</section>
</main>
<footer>
<p>© 2025 GGUF Loader. All rights reserved.</p>
</footer>
<!-- Mobile Menu Script -->
<script src="mobile-menu.js" defer></script>
</body>
</html>