The global defaults are OpenAI. To change them, install the LLM and embedding model classes you want to use, then either update the global defaults or pass the LLM/embedding model in explicitly.

For example:
```bash
pip install llama-index-llms-ollama llama-index-embeddings-ollama
```
To change the global defaults:

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

Settings.llm = Ollama(...)
Settings.embed_model = OllamaEmbedding(...)
```
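As a concrete sketch, assuming a local Ollama server with the models below already pulled (the model names `llama3.1` and `nomic-embed-text` are illustrative, not required):

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

# Assumes an Ollama server is running locally and these models have been
# pulled; swap in whichever models you have available.
Settings.llm = Ollama(model="llama3.1", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")
```

Once set, every index and query engine created afterwards will pick up these models automatically.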
To override the global defaults locally:

```python
index = VectorStoreIndex(..., embed_model=embed_model)
query_engine = index.as_query_engine(..., llm=llm)
```
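Putting the local override together, here is a minimal end-to-end sketch. It again assumes a running Ollama server; the model names and the document text are hypothetical placeholders:

```python
from llama_index.core import VectorStoreIndex, Document
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

# Hypothetical local models; requires a running Ollama server.
llm = Ollama(model="llama3.1")
embed_model = OllamaEmbedding(model_name="nomic-embed-text")

documents = [Document(text="LlamaIndex supports per-index model overrides.")]

# The embed model passed here is used to embed documents at build time
# (and queries at retrieval time), without touching the global Settings.
index = VectorStoreIndex.from_documents(documents, embed_model=embed_model)

# The LLM passed here is used only by this query engine to synthesize answers.
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query("What does LlamaIndex support?")
```

Local overrides like this are handy when a single application mixes models, for example a cheap embedding model for one index and a different LLM per query engine.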