
Commit b41841a

release: 0.7.0-alpha.2 (#329)
Automated Release PR

## 0.7.0-alpha.2 (2026-04-01)

Full Changelog: [v0.7.0-alpha.1...v0.7.0-alpha.2](v0.7.0-alpha.1...v0.7.0-alpha.2)

### Features

* add reasoning as valid conversation item ([029da3f](029da3f))
* add reasoning output types to OpenAI Responses API spec ([3bb043e](3bb043e))

### Chores

* **tests:** bump steady to v0.20.1 ([82edffa](82edffa))
* **tests:** bump steady to v0.20.2 ([8aab687](8aab687))

### Refactors

* remove deprecated register/unregister model endpoints ([6c82145](6c82145))

---

This pull request is managed by Stainless's [GitHub App](https://github.com/apps/stainless-app). The [semver version number](https://semver.org/#semantic-versioning-specification-semver) is based on the included [commit messages](https://www.conventionalcommits.org/en/v1.0.0/). Alternatively, you can set the version number manually in the title of this pull request. When merging, it is recommended to use either rebase-merge or squash-merge.

🔗 Stainless [website](https://www.stainlessapi.com)
📚 Read the [docs](https://app.stainlessapi.com/docs)
🙋 [Reach out](mailto:support@stainlessapi.com) for help or questions

Co-authored-by: stainless-app[bot] <142633134+stainless-app[bot]@users.noreply.github.com>
1 parent 6fc77eb commit b41841a

24 files changed: 599 additions & 540 deletions

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
 {
-  ".": "0.7.0-alpha.1"
+  ".": "0.7.0-alpha.2"
 }
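release-please reads the version for each package path from this manifest (the `"."` key maps the repo root to its current version). A minimal sketch of reading it, assuming standard JSON tooling (`manifest_version` is an illustrative helper, not part of the repo):

```python
import json

def manifest_version(manifest_text: str, package_path: str = ".") -> str:
    """Return the version release-please recorded for a package path."""
    return json.loads(manifest_text)[package_path]

# After this commit the manifest maps the repo root to the new version:
print(manifest_version('{".": "0.7.0-alpha.2"}'))  # → 0.7.0-alpha.2
```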

.stats.yml

Lines changed: 4 additions & 4 deletions

@@ -1,4 +1,4 @@
-configured_endpoints: 94
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-7b856674124b79094ac28a6ac451d7a67b5ddd74aebecd5e468a1f8ccfd13bd1.yml
-openapi_spec_hash: a5ca7c4dac274c534338a9b3f5d388c0
-config_hash: 7d5765272a641656f8231509937663a7
+configured_endpoints: 92
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-a6b10a7f923a8cf216108cd794ccbac5d4114193ba888fea0c1288548b28f37e.yml
+openapi_spec_hash: ed2df655e1a9041bf71adfb37ed651fe
+config_hash: d8a05907bd87286473cdf868da7d2ede

CHANGELOG.md

Lines changed: 20 additions & 0 deletions

@@ -1,5 +1,25 @@
 # Changelog
 
+## 0.7.0-alpha.2 (2026-04-01)
+
+Full Changelog: [v0.7.0-alpha.1...v0.7.0-alpha.2](https://github.com/llamastack/llama-stack-client-python/compare/v0.7.0-alpha.1...v0.7.0-alpha.2)
+
+### Features
+
+* add reasoning as valid conversation item ([029da3f](https://github.com/llamastack/llama-stack-client-python/commit/029da3fb41d13b6419e7d49b5b04f525818cf731))
+* add reasoning output types to OpenAI Responses API spec ([3bb043e](https://github.com/llamastack/llama-stack-client-python/commit/3bb043e2859ae601cd69c380e1749a1ff18a2a00))
+
+
+### Chores
+
+* **tests:** bump steady to v0.20.1 ([82edffa](https://github.com/llamastack/llama-stack-client-python/commit/82edffaebfa5d36d9494bee945a64b64d4453414))
+* **tests:** bump steady to v0.20.2 ([8aab687](https://github.com/llamastack/llama-stack-client-python/commit/8aab6875d8eac1a9aea91b80ab29d2cfe596d4e0))
+
+
+### Refactors
+
+* remove deprecated register/unregister model endpoints ([6c82145](https://github.com/llamastack/llama-stack-client-python/commit/6c82145f77a9b461a5d2e36492d995d23114eed3))
+
 ## 0.7.0-alpha.1 (2026-03-28)
 
 Full Changelog: [v0.6.1-alpha.1...v0.7.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.6.1-alpha.1...v0.7.0-alpha.1)

api.md

Lines changed: 0 additions & 3 deletions

@@ -249,16 +249,13 @@ from llama_stack_client.types import (
     Model,
     ModelRetrieveResponse,
     ModelListResponse,
-    ModelRegisterResponse,
 )
 ```
 
 Methods:
 
 - <code title="get /v1/models/{model_id}">client.models.<a href="./src/llama_stack_client/resources/models/models.py">retrieve</a>(model_id) -> <a href="./src/llama_stack_client/types/model_retrieve_response.py">ModelRetrieveResponse</a></code>
 - <code title="get /v1/models">client.models.<a href="./src/llama_stack_client/resources/models/models.py">list</a>() -> <a href="./src/llama_stack_client/types/model_list_response.py">ModelListResponse</a></code>
-- <code title="post /v1/models">client.models.<a href="./src/llama_stack_client/resources/models/models.py">register</a>(\*\*<a href="src/llama_stack_client/types/model_register_params.py">params</a>) -> <a href="./src/llama_stack_client/types/model_register_response.py">ModelRegisterResponse</a></code>
-- <code title="delete /v1/models/{model_id}">client.models.<a href="./src/llama_stack_client/resources/models/models.py">unregister</a>(model_id) -> None</code>
 
 ## OpenAI
 

pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.7.0-alpha.1"
+version = "0.7.0-alpha.2"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"

scripts/mock

Lines changed: 8 additions & 5 deletions

@@ -28,21 +28,24 @@ echo "==> Starting mock server with file ${SPEC_PATH}"
 
 # Run steady mock on the given spec
 if [ "$1" == "--daemon" ]; then
-  npm exec --package=@mockoon/cli@9.3.0 -- mockoon-cli start --data "$SPEC_PATH" --port 4010 &>.mockoon.log &
+  # Pre-install the package so the download doesn't eat into the startup timeout
+  npm exec --package=@stdy/cli@0.20.2 -- steady --version
+
+  npm exec --package=@stdy/cli@0.20.2 -- steady --host 127.0.0.1 -p 4010 --validator-query-array-format=comma --validator-form-array-format=comma --validator-query-object-format=brackets --validator-form-object-format=brackets "$URL" &> .stdy.log &
 
   # Wait for server to come online via health endpoint (max 30s)
   echo -n "Waiting for server"
-  while ! grep -q "Error: \|Server started on port 4010" ".mockoon.log"; do
+  while ! grep -q "Error: \|Server started on port 4010" ".stdy.log"; do
     echo -n "."
     sleep 0.1
   done
 
-  if grep -q "Error: " ".mockoon.log"; then
-    cat .mockoon.log
+  if grep -q "Error: " ".stdy.log"; then
+    cat .stdy.log
     exit 1
   fi
 
   echo
 else
-  npm exec --package=@mockoon/cli@9.3.0 -- mockoon-cli start --data "$SPEC_PATH" --port 4010
+  npm exec --package=@stdy/cli@0.20.2 -- steady --host 127.0.0.1 -p 4010 --validator-query-array-format=comma --validator-form-array-format=comma --validator-query-object-format=brackets --validator-form-object-format=brackets "$URL"
 fi
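The daemonized branch above polls the steady log until it sees either the startup line or an error. That same wait loop can be sketched in Python; the sentinel strings and 30s timeout mirror the script, while the function name and structure are illustrative, not part of the repo:

```python
import time

STARTED = "Server started on port 4010"
ERROR = "Error: "

def wait_for_server(log_path: str, timeout: float = 30.0, interval: float = 0.1) -> None:
    """Poll a log file until the mock server reports startup, like scripts/mock does."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with open(log_path) as f:
                text = f.read()
        except FileNotFoundError:
            text = ""  # log not created yet; keep polling
        if ERROR in text:
            raise RuntimeError(f"mock server failed to start:\n{text}")
        if STARTED in text:
            return  # server is up
        time.sleep(interval)
    raise TimeoutError(f"no startup line in {log_path} after {timeout}s")
```

Polling the log rather than the port keeps the check simple, at the cost of depending on steady's exact startup message.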

scripts/test

Lines changed: 1 addition & 1 deletion

@@ -47,7 +47,7 @@ elif ! prism_is_running; then
  echo -e "To run the server, pass in the path or url of your OpenAPI"
  echo -e "spec to the steady command:"
  echo
- echo -e " \$ ${YELLOW}npm exec --package=@stdy/cli@0.19.7 -- steady path/to/your.openapi.yml --host 127.0.0.1 -p 4010 --validator-query-array-format=comma --validator-form-array-format=comma --validator-query-object-format=brackets --validator-form-object-format=brackets${NC}"
+ echo -e " \$ ${YELLOW}npm exec --package=@stdy/cli@0.20.2 -- steady path/to/your.openapi.yml --host 127.0.0.1 -p 4010 --validator-query-array-format=comma --validator-form-array-format=comma --validator-query-object-format=brackets --validator-form-object-format=brackets${NC}"
  echo
 
  exit 1

src/llama_stack_client/_version.py

Lines changed: 1 addition & 1 deletion

@@ -7,4 +7,4 @@
 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.
 
 __title__ = "llama_stack_client"
-__version__ = "0.7.0-alpha.1" # x-release-please-version
+__version__ = "0.7.0-alpha.2" # x-release-please-version
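release-please uses the trailing `# x-release-please-version` annotation to locate lines whose quoted version string it should rewrite on each release. A minimal sketch of that substitution, where the helper and regex are illustrative rather than release-please's actual implementation:

```python
import re

MARKER = "x-release-please-version"

def bump_version_line(line: str, new_version: str) -> str:
    """Rewrite the quoted version on a line annotated for release-please."""
    if MARKER not in line:
        return line  # only annotated lines are rewritten
    return re.sub(r'"[^"]*"', f'"{new_version}"', line)

old = '__version__ = "0.7.0-alpha.1" # x-release-please-version'
print(bump_version_line(old, "0.7.0-alpha.2"))
# → __version__ = "0.7.0-alpha.2" # x-release-please-version
```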
