It's UWAweek 19

help2002

This forum is provided to promote discussion amongst students enrolled in CITS2002 Systems Programming.
Please consider offering answers and suggestions to help other students! And if you fix a problem by following a suggestion here, it would be great if other interested students could see a short "Great, fixed it!"  followup message.


1:55pm Tue 12th Sep, ANONYMOUS

I'm up to the calculation part of my algorithm. By my logic, I would assume that a data-bus acquisition only occurs when a process executes a "read" or "write" system call, since the data-bus is only used to transfer information between specific locations, such as sending data to the hard disk or terminal. From that, I wouldn't expect the data-bus to be acquired while the other system calls execute, especially "wait" and "exit". So, is my assumption correct, or does a data-bus acquisition happen for every system call on the project sheet (i.e. not just for "read" and "write", but for "spawn", "sleep", "wait" and "exit" as well)?

I also want to check something. I asked ChatGPT whether the data-bus can only be acquired once within a whole individual command, or whether it can be acquired multiple times within that one command. The AI told me it can happen multiple times, up to however many system calls are used within a single command (whether that's only "read" and "write", or all six system calls). Is that correct, or am I being misinformed?

Just one more thing: I also asked ChatGPT whether I have to add the 20-microsecond data-bus acquisition delay to the total time (in microseconds) separately. It responded that the delay is included in the execution times themselves. So, for example, instead of "total time = 80 + 20 = 100 microseconds" (adding the 20-microsecond delay separately), I would do "total time = 80 microseconds" (where the 20-microsecond delay is part of the full 80-microsecond execution time, i.e. the data-bus acquisition occurs simultaneously with the process's execution).
Is the AI correct here, or am I also being misinformed on this point? Could someone, or Chris, please clarify these points for me? Once I understand this, writing the calculation parts of the algorithm will be much easier. Thank you very much!

The University of Western Australia

Computer Science and Software Engineering

CRICOS Code: 00126G
Written by [email protected]
Last modified  8:08AM Aug 25 2024