Control and Query your Tesla vehicles (via Tessie.com)

I have checked mine and have not seen one error. I send 2 queries every 15 minutes.

So I would say it's some network issue on your side. Or have you checked that you set Tessie to use the NEW API? Tesla just finally got rid of the old legacy API and you could still be trying to use that under Tessie; they just sent out an email regarding that.

Yeah, that was my first thought, that Tessie was somehow still using the legacy API, but it is set to next gen. Plus direct control from the Tessie interface was working; it was only failing via the Hubitat app. Might just be a glitch of some sort. I'll restart everything and monitor, thanks.

Still happening. This was the first command request of the day. Normal status queries are all operating correctly. Any ideas?

app:173  2024-06-16 10:35:55.594 AM  error  Request failed for /5YJSB7S10FFXXXXX/command/start_charging, [error:Request rate limited by Tesla.] - groovyx.net.http.HttpResponseException: status code: 429, reason phrase: Too Many Requests!
app:173  2024-06-16 10:35:55.592 AM  debug  In general error handler case resp = null groovyx.net.http.HttpResponseException: status code: 429, reason phrase: Too Many Requests
app:173  authorizedHttpRequestWithTimeout GET /5YJSB7S10FFXXXXXX/command/start_charging try: 1
app:173  2024-06-16 10:35:54.730 AM  info  executeApiCommand
app:173  2024-06-16 10:35:54.679 AM  error  Request failed for /5YJSB7S10FFXXXXXX/command/set_charging_amps?amps=7, [error:Request rate limited by Tesla.] - groovyx.net.http.HttpResponseException: status code: 429, reason phrase: Too Many Requests!
app:173  2024-06-16 10:35:54.659 AM  debug  In general error handler case resp = null groovyx.net.http.HttpResponseException: status code: 429, reason phrase: Too Many Requests
app:173  2024-06-16 10:35:53.164 AM  info  authorizedHttpRequestWithTimeout GET /5YJSB7S10FFXXXXXX/command/set_charging_amps?amps=7 try: 1
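
For reference, the "try: 1" in the log suggests a retry counter, so maybe a short backoff on 429 would smooth this over. Just a sketch on my part, not the app's actual code: the api.tessie.com base URL, the tessieToken setting name, and the retry timings are my assumptions, while the command paths are the ones from the log above.

```groovy
// Minimal sketch, not the app's actual code: retry a Tessie command a couple
// of times with an increasing pause when Tesla/Tessie answers 429.
import groovyx.net.http.HttpResponseException

def sendTessieCommand(String vin, String command, int maxTries = 3) {
    String uri = "https://api.tessie.com/${vin}/command/${command}"   // base URL assumed
    for (int attempt = 1; attempt <= maxTries; attempt++) {
        try {
            httpGet(uri: uri, headers: [Authorization: "Bearer ${settings.tessieToken}"]) { resp ->
                log.info "${command} returned status ${resp.status}"
            }
            return
        } catch (HttpResponseException e) {
            if (e.statusCode == 429 && attempt < maxTries) {
                log.warn "Rate limited on ${command}, retrying in ${5 * attempt}s (attempt ${attempt})"
                pauseExecution(5000 * attempt)   // Hubitat's millisecond sleep
            } else {
                throw e
            }
        }
    }
}
```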

How often are you sending requests?
Maybe ask Tessie, as that error is coming from them.

Command requests were normally twice a day before this. The failure above was the first request of the day (I've been setting up a rule to control charging which will send additional commands, but the first failure of this kind happened before I even enabled that).

Is it your understanding that the Tessie API has request limits? I would have thought they would be on a commercial Tesla API plan that is effectively unlimited, at least at any practical request level.

Tried another request just before (via the Hubitat device) and that succeeded without error, so this seems semi-random. Will see if I can get any info out of Tessie, but I'm expecting they might say they don't support third-party integrations! :slight_smile:

I just tried changing the charging level and starting charging on my car, just in case there was a weird error I made in the API calls, but they all worked OK, although the refresh didn't show the charge_limit_Soc (chargelevel internal) change till a few refreshes later.

In all I must have issued 40 commands in the space of 10 minutes and got no errors.
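
For clarity, the calls I was testing were roughly the two below. The VIN is a placeholder and the token setting name is made up, and the base URL is my assumption, but the command paths are the same ones in your log.

```groovy
// Rough illustration only: the two Tessie command paths from the log above,
// sent as plain GETs. Base URL and the "tessieToken" setting are assumptions.
["set_charging_amps?amps=7", "start_charging"].each { cmd ->
    httpGet(uri: "https://api.tessie.com/5YJSB7S10FFXXXXXX/command/${cmd}",
            headers: [Authorization: "Bearer ${settings.tessieToken}"]) { resp ->
        log.debug "${cmd} -> ${resp.status}"
    }
}
```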

Thanks for testing. Yeah, I've definitely had the commands succeed but then start throwing this error again later, so it never seemed like there was a bug in your integration. I just saw on the Tessie site an article about 'too many requests' errors in the Tesla app itself, one cause of which can supposedly be poor wifi (presumably the car sending too many requests because of lost responses, or something). If that's true I might switch the car to cellular only in the garage and see if that has an effect. It still seems like poor resilience in the Tesla API design if that can happen, though.

The wifi in the car (at least in my Model 3) is utter crap. I had to put a separate router in the garage for the Tesla, literally 10 feet away. I believe the antenna is in the passenger mirror.

Yeah I've done similar in the past. The connection isn't great but good enough for most things normally. But if it can potentially cause this problem I'll have to improve it or stop using it.

Feedback from Tessie is that this is a result of Tesla enforcing, since June 14, the heavy rate limits of the existing (and only) 'Discovery' API plan. That doesn't quite match exactly what I'm seeing, but regardless, those Tesla limits will break what I'm trying to do at some point (and also break services like ChargeHQ that use the Fleet API). Hopefully Tesla announces sensible plans that commercial clients such as Tessie can subscribe to, maintaining usability for their end users and third-party integrators. Pretty poor on Tesla's part.

Yeah, that doesn't make sense; everyone, including me, should be seeing it then.

They directed me to this discussion. Sounds like it's perhaps rolling out across the fleet in stages, so not all users might be seeing the limits yet: API returns "Too many requests" (429) · Issue #255 · teslamotors/vehicle-command · GitHub

It's 3x more expensive per month now!

I went with the lifetime purchase myself. I really, really hate subscriptions…

I just got this notification as well, although under my account subscription there is no indication that the price is changing, so I'm not sure what that means.

Regardless, this will render all the work I've done recently moot, as I won't be paying 3 times as much for a convenience subscription that was only marginally worthwhile in the first place. It starts to make the effort required to implement a direct solution against Tesla's API more attractive, I imagine. Not happy.

Doing a quick search, Teslemetry looks interesting. They might be about to hike prices unreasonably as well, of course, but as it stands it provides the API access (without all the unnecessary Tessie extras) at a much more reasonable price.

Have you looked at any other potential integration options for your app, Larry (@kahn-hubitat)? I would expect porting from the Tessie API to a different third-party one would be a relatively straightforward task, unless their wrappers are massively different?

In order to ensure I charge my Tesla using only excess solar energy, I make a substantial number of updates to the charging amperage throughout the day. I am very concerned that these rate limits may impact that, not allowing me to update the values as needed…

I understand that Tesla solar does this automatically, but I had asked them for a few quotes and never got a response, so I ended up going with a local installer. The link therefore had to be made through Hubitat to allow the two to be synced.

I haven't been impacted yet, but will be very concerned when I am… I really wish there were a local API that would allow for local updates to charging amperage… It is concerning when companies only provide cloud APIs and then complain that they have to pay for infrastructure and are impacted by the volume of data…
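
For what it's worth, the calculation itself is simple; roughly something like the sketch below. The 240 V supply and the 5–16 A bounds are just assumed example numbers, not my actual setup.

```groovy
// Rough sketch only, not my actual rule: turn measured excess solar power
// into a charging-amps value. The voltage and amp bounds are assumed examples.
int ampsForExcessSolar(Number excessWatts, Number volts = 240, int minAmps = 5, int maxAmps = 16) {
    int amps = (excessWatts / volts).intValue()   // whole amps the surplus can support
    if (amps < minAmps) return 0                  // not enough surplus; stop charging instead
    return Math.min(maxAmps, amps)
}

// e.g. 2800 W of surplus on a 240 V circuit -> 11 A, which would then be sent
// via the set_charging_amps command seen earlier in the thread
assert ampsForExcessSolar(2800) == 11
```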

Yeah, I'm in the same boat. It's annoying that I have a locally wifi-connected vehicle 10 m away and yet I have to go through paid cloud APIs to do basic operations like charging and climate control. Direct access to the API in the vehicle itself (which clearly must exist) would be so much easier.

Tesla's Charge On Solar feature would provide some of what I do, but is inexplicably not available (at least yet) on legacy S and X models even though the vehicles clearly support everything needed to do it.

First world problems, I know, but unless things change it's looking like I'll lose automation abilities that I've had for years, as the convenience will not be worth the cost.

Has anyone else seen this issue or error code? I haven't, and I think their excuse may be BS and something else is going on.

Haven’t looked for it… since everything is working as expected… but haven’t seen it either.

I agree that a local interface would be a huge benefit, but I work with what I'm given.

FWIW, my adjustment program reads data for two cars from the server every 13 seconds, and writes (adjusts the amperage) every 30 seconds as needed. I don't think you need to worry. :slight_smile:
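
If it helps picture it, the cadence has roughly the shape below, sketched in Hubitat terms. runIn and schedule are standard Hubitat scheduling calls; the handler names and bodies are placeholders, and my real program isn't structured exactly like this.

```groovy
// Rough shape of the cadence only; handler bodies are placeholders.
def initialize() {
    runIn(13, "pollCars")                     // 13-second read loop, re-armed below
    schedule("0/30 * * * * ?", "adjustAmps")  // consider a write every 30 seconds
}

def pollCars() {
    // read the latest state for both cars from the server here
    runIn(13, "pollCars")                     // chain the next read
}

def adjustAmps() {
    // compare desired vs current amps and only send set_charging_amps
    // when the value actually needs to change ("as needed")
}
```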
