SA question
I have burned a few EPROMs for my 91 Vette but I do not quite understand how the ECM handles the SA. As I understand it, I should make a first WOT run so the ECM will do its SA adjustment. Apparently the ECM advances the SA until knock occurs and then subtracts a few degrees. Exactly how does the ECM do this?
When I take my new chip for a spin, when will the ECM do the adjustment? Is it during WOT, after WOT, at idle or perhaps when the ignition is turned off (during shut down)? Does it apply a ramp or a step to the SA? Is this adjustment done every time I drive my car? Does it make several corrections or just one?
Robert
The first time I try a new chip, the SA reported by the ECM on the ALDL connector is very close to the values in the main SA table. They will not differ as long as the engine is in open loop.
Once the engine enters closed loop the ECM starts making adjustments to the SA. It seems to shift the entire table up or down by some constant. Driving my car for 10 minutes, I notice that the ECM both adds and subtracts several times. The adjustment is not only during WOT but seems to occur all the time.
When you have a new chip that you want to try, perhaps you should run it at more MAP values than just 100 kPa before you evaluate it. Would it not be a good idea to run the entire MAP range a few times before making the proper tests of the new chip?
So, is there someone out there who could tell me if I have misunderstood something?
Robert
Thanks for your reply spankyellow.
Recently I started to use Datamaster, which seems OK to me. First I used 3.4.1 but it has some bugs (eg, the AF ratio is not correct) so I'm currently running 3.4.17, which looks much better. I had no installation problems at all (I'm running Windows 2000).
Since you ask:
The $8D mask for a 727 ECM (if that is what you are using) has no 'learned' spark
advance behavior. The base SA tables are what the engine runs after start-up.
Period. The ECM cannot 'adjust' the tables: that is only done using a program
editor to re-write the table entries.
There are a few modifiers applied to the base timing - mainly minor step-changes
which are temperature-dependent (cold start-up), and adjustments made by the
knock-retard routines. There is also the idle hysteresis described here (part of the
fueling control):
http://forums.corvetteforum.com/showthread.php?t=556467
The ECM can also conduct a diagnostic check of the KS, but that is done only on a
fully warmed engine at cruise speed (IIRC) and has no impact on the base tables.
It is strictly a momentary phenomenon.
All of the spark behavior is defined in the base program, including the idle spark,
idle hysteresis, PE spark advance, spark latency, stall saver, coolant temp, and
a few other tables I've probably forgotten. None are likely to cause the SA to vary
much from the base tables once the engine is warm.
From the information provided, one could only hypothesize that SA changes you report
seeing might result from:
1. Faulty sensors or wiring.
2. ECM fault or EMI.
3. Problems with the cable/scan tool.
4. Other
HTH
Last edited by DOCTOR J; Nov 5, 2004 at 05:01 AM.
I don't think there is anything wrong with the SA I see using my scanner. The problem is that I try to calculate total SA from the tables in the EPROM and compare it to what I see using my scanner. They do not quite agree (up to 1 deg deviation, so it is fairly small) and I was wondering why.
From what you have written I see that I have overlooked the temperature dependence. Although the engine is "fully" warm, the temperature still varies; I'll have another look at my data to see if the temperature variation was large enough to alter the SA.
I've noticed the "spark scatter" at idle as your link showed. I found those tables and they explain the 5 deg "spikes" I see at idle. I have also noticed a smaller variation, 1-2 deg, in idle SA... could that be due to mechanical tolerances in the distributor? I can't find a table with such a small correction, so I thought it might have a mechanical origin.
Thanks for the information
The ECM derives the SA by reading the RPM and MAP sensors, then doing a 3D table look-up. The SA from the table is what the ECM sends
to the ignition module, AND reports to the ALDL bus. Since the 'commanded' SA and the
'ALDL' SA are the same number, it would be impossible for them to be different by any
amount.
Narrative information on how GM does a 3D table look-up is here:
http://forums.corvetteforum.com/showthread.php?t=547494
The actual sub-routines for iterating different tables and putting register values on the
ALDL bus are spelled-out in the $8D hack text. Seems to me the iterations are effectively
linear; the MAP values are 8-bit and DRP values are 16?-bit (but I'd need to go look to
be 100% certain). RPM is reported in 8-bit form, and spark degrees - I don't remember -
8-bit I think.
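The 3D look-up described above can be sketched as a bilinear interpolation over an RPM x MAP grid. This is only an illustrative sketch: the axis breakpoints and table values below are invented, not real $8D data, and the real ECM works on raw 8-bit register values rather than floats.

```python
# Hypothetical sketch of the ECM's spark look-up: bilinear interpolation
# over an RPM x MAP grid. All numbers here are made up for illustration.

def interp_1d(x, x0, x1, y0, y1):
    """Linear interpolation between two breakpoints."""
    if x1 == x0:
        return y0
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

def spark_lookup(rpm, map_kpa, rpm_axis, map_axis, table):
    """Bilinear look-up of spark advance (degrees) from a 2-D table."""
    # find the bracketing cell indices, clamped to the table edges
    i = max(0, min(len(rpm_axis) - 2,
                   sum(1 for r in rpm_axis if r <= rpm) - 1))
    j = max(0, min(len(map_axis) - 2,
                   sum(1 for m in map_axis if m <= map_kpa) - 1))
    # interpolate along RPM at the two bracketing MAP rows, then along MAP
    lo = interp_1d(rpm, rpm_axis[i], rpm_axis[i + 1],
                   table[j][i], table[j][i + 1])
    hi = interp_1d(rpm, rpm_axis[i], rpm_axis[i + 1],
                   table[j + 1][i], table[j + 1][i + 1])
    return interp_1d(map_kpa, map_axis[j], map_axis[j + 1], lo, hi)

rpm_axis = [400, 800, 1200, 1600]    # RPM breakpoints (invented)
map_axis = [40, 60, 80, 100]         # MAP kPa breakpoints (invented)
table = [[20, 24, 28, 30],           # rows = MAP, cols = RPM (invented)
         [18, 22, 26, 28],
         [14, 18, 22, 24],
         [10, 14, 18, 20]]

print(spark_lookup(600, 50, rpm_axis, map_axis, table))  # -> 21.0
```

Since the iterations are effectively linear (per the post above), the output between breakpoints is just a weighted blend of the four surrounding cells.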
If the scanner timing is different from the SA table timing, at steady engine speed and
coolant temp, I would look for:
1. Faults in the sensor wiring & grounding.
2. EMI from spark plug wire routing & shielding.
If you are seeing differences from the SA table values during engine transients, I would
expect you need to take into account the different data rates of the timing control loop
(~80 updates/sec) and the ALDL rate (~7 updates/sec).
Distributor lash (mechanical slop) certainly adds to jitter in the data. Though I haven't
experienced one that throws the timing out by 1 or 2 * at steady-state conditions
myself, it is possible. I'd probably look to the MAP values first, and see if they were
jumping - the 8-bit values there lend themselves to greater output variations.
There are also a couple of spots where the RPM calculation goes through some MSB
manipulation (over the engine's RPM range) in the ECM. However those seem to be
only minor discontinuities in the data stream, and are limited to only two spots on the
RPM data logs.
Thanks for the link, very impressive. It will require a lot of thinking before I understand everything in that thread.
One other thing. The SA tables are "total" SA; you do not add initial timing to them. The SA table is what you see with your scanner, not SA table + initial timing. Suppose my initial timing is set to 6 deg and my distributor is also set to 6 deg. If I make a new chip with 4 deg initial timing and also turn the distributor to 4 deg, then I should not notice any difference in total SA, right? For example, for a given MAP and RPM there will be no difference in SA as seen by the scanner and the engine will run exactly the same (same amount of knock etc), right? If I change both initial timing and distributor setting by 2 deg, will it have any effect at all?
But engine speed varies 600 to 625 rpm at idle, a simple interpolation results in a 1.23 deg SA variation.
On overspeed, the ECM subtracts 5* of timing for the first 50 RPM over the
idle set point. Since the table interpolation is linear, the ECM would be subtracting ~1* of spark for
each 10 RPM overspeed. IIRC, one RPM bit at idle = 12.5 RPM, so ~1.25* would make sense as the
smallest spark difference that would be reported by the scan prog. I'd further guess that the scan
s/w is probably set up to report RPMs in increments of 25 - hence it lacks enough resolution to 'see'
smaller changes that the ECM may be running. That's MHO, anyway.
The idle SA comes from the 'Closed TPS Spark Advance' table - and the idle modifiers.
However if you only show 12.7* of SA at idle it does look on the low side. Stock GM value was
more like 20-25* actual, or physical SA.
Changing the initial timing and the distributor setting together by the same amount has no effect on the total SA.
HTH
I still have some knock counts above 65 kPa which only occur in higher gears, so it seems my SA tables need some tweaking. See
SA main table change for modified L98
Thanks for the help
If you read my second post above, I believe I wrote that the ECM reports RPM with an
8-bit word. You can check the actual $8D hack to be sure, but if I have it right then
6400 RPM/ 256 does NOT give 1-RPM measurement resolution.
Data out have no more accuracy than data in. This is one of those awkward mathematical
rules that applies everywhere (except on internet bulletin boards).
The point of which is: if one plans to analyze data it is helpful to have some idea of where
they originate - otherwise one ends up trying to analyze numbers and trends that do not
exist. Just something to keep in mind.
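The resolution point above is quick to check. Assuming an 8-bit RPM word with a 6400-RPM full scale (the figure used in the post; worth confirming against the actual $8D hack), each count is worth 25 RPM:

```python
# Rough check of the 8-bit RPM resolution claim from the post.
FULL_SCALE_RPM = 6400   # assumed full-scale value, per the post
COUNTS = 256            # one 8-bit word

rpm_per_count = FULL_SCALE_RPM / COUNTS
print(rpm_per_count)    # -> 25.0 RPM per bit, nowhere near 1-RPM resolution
```

So any apparent sub-25-RPM detail in a logged RPM trace would be an artifact of the analysis, not real measurement resolution.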
Re the question about how to time the spark map for a supercharged V-8:
No offense intended, but unless one has some considerable skill and understanding of
engines & control systems, this seems like a very daunting do-it-yourself proposition.
I respectfully suggest the best course might be to take the problem to a technician who
has real, demonstrated experience in that area, as opposed to soliciting feckless opinion
from the internet. For an experienced technician, you might try paging TJWong who posts
on this board. I have no personal experience with his work, but he claims to have a
commercial background in forced-induction engines.
Good luck with your project, whatever you decide to do.
Thanks for the research, I hadn't remembered that one.
Unfortunately the DRP double word is one that my edition of Diacom does not seem to display.
I'll have to look at a new version of Datamaster; plotting the high-res signal might be
useful in some situations - such as idle adjustments.
Nevertheless, the ALDL stream still gives only ~7 frames/sec of updates. I found this rate
inadequate for tuning engine transients. One solution to the problem was outlined here:
http://www.corvetteforum.net/c4/doctorj/carwb.htm
There are some new WB/Logger units available since '01, in more elegant & integrated
packages. However none that I have looked at (so far) provide a 120 sample/ sec data
rate. Having become used to it, I wouldn't try to configure spark and fuel with much less.
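To see why ~7 frames/sec is thin for transients, count how many samples each rate actually captures during a quick tip-in. The 0.5-second transient duration below is an illustrative assumption, not something from the post.

```python
# Samples captured during a short throttle transient at each data rate.
ALDL_RATE_HZ = 7        # ALDL stream, frames per second (from the post)
LOGGER_RATE_HZ = 120    # WB/logger sample rate (from the post)
TRANSIENT_S = 0.5       # a quick tip-in, for illustration (assumed)

aldl_samples = int(ALDL_RATE_HZ * TRANSIENT_S)      # 3 frames
logger_samples = int(LOGGER_RATE_HZ * TRANSIENT_S)  # 60 samples
print(aldl_samples, logger_samples)
```

Three frames across a whole transient leaves almost nothing to tune against, which is the motivation for the 120 sample/sec logger.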
EGT is also a useful one that I need to add to the logger someday, but haven't done yet.