
Thread: timer loses one second every tick!

  1. #1

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    timer loses one second every tick!

    Hi, I've been using the Timer control, and when I set the Interval to 7000 (= 7 seconds), every time it ticks it loses one second.
    I haven't used any code to reduce the interval each tick; I just use a MsgBox, and it seems every tick it loses one second, so it doesn't meet the Interval I set on the timer. Please help.

  2. #2
    Super Moderator jmcilhinney's Avatar
    Join Date
    May 2005
    Location
    Sydney, Australia
    Posts
    110,344

    Re: timer loses one second every tick!

    What exactly do you mean by "it loses one second"? Are you saying that the Interval becomes 6000 instead of 7000? That's the only thing that would actually make sense. Most likely your expectations aren't appropriate.

    Instead of using a message box, which will block the UI thread and may cause events to be handled later than you expect, try using Console.WriteLine to output the current time. You can then keep an eye on the Output window in the IDE and see whether the gaps between the times displayed are a close match for your Interval. The differences will not be exactly 7000 milliseconds but, if you don't have anything else blocking the UI thread for any significant length of time, they should be pretty close.
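    For example, a minimal version of that check might look like this (assuming a WinForms Timer named Timer1 with its Interval set to 7000):
    Code:
        Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
            'write a timestamp to the Output window each tick so the gaps can be compared to the Interval
            Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff"))
        End Sub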

  3. #3

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    What exactly do you mean by "it loses one second"?...
    here is the sample code
    Code:
    Public Class Form1
        Dim x As Integer = 1 'counter driven by Timer2
    
        Private Sub Timer1_Tick(sender As System.Object, e As System.EventArgs) Handles Timer1.Tick
            TextBox1.Text = "ON COM"
        End Sub
    
        Private Sub Form1_Load(sender As System.Object, e As System.EventArgs) Handles MyBase.Load
            Timer1.Enabled = True
            Timer2.Enabled = True
        End Sub
    
        Private Sub Timer2_Tick(sender As System.Object, e As System.EventArgs) Handles Timer2.Tick
            TextBox1.Text = CStr(x)
            x += 1
            If x = 8 Then
                x = 1
            End If
        End Sub
    End Class
    1000 = 1 second, right? I used a textbox to output the timer counter.

  4. #4

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    What exactly do you mean by "it loses one second"?...
    Yes, it becomes 6000.

  5. #5

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    What.
    I set Timer1 to a 7000 interval and Timer2 to a 1000 interval, so every time Timer2 ticks it adds 1 to the x variable and outputs it to the textbox. I don't know why the Timer1 tick sometimes takes 14000 ms, or less than that but more than 7000, when it starts on form load.

  6. #6
    Super Moderator jmcilhinney's Avatar
    Join Date
    May 2005
    Location
    Sydney, Australia
    Posts
    110,344

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    Yes, it becomes 6000.
    Does it really? Did you check the value of the property and it was 6000? If so then something on your system is broken. Did you actually do as I instructed and test using Console.WriteLine to display the current time? If so then what were the results? If not then I'll be back when you have.

  7. #7

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    Does it really? Did you check the value of the property and it was 6000?...


    But I'm using WinForms. It does not change to an absolute 6000; I think it has delays or something. I'm sure it is accurately counting to 7 seconds, but in the long run it changed to something else.

  8. #8
    Super Moderator jmcilhinney's Avatar
    Join Date
    May 2005
    Location
    Sydney, Australia
    Posts
    110,344

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    But I'm using WinForms. It does not change to an absolute 6000...
    Apparently you are refusing to do as I've suggested so you're wasting my time. I'm done.

  9. #9

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    Apparently you are refusing to do as I've suggested so you're wasting my time. I'm done.
    I output the current time. It seems that the timer adds one second in the long run and then goes back to the 7 second interval.

  10. #10
    Frenzied Member
    Join Date
    Dec 2014
    Location
    VB6 dinosaur land
    Posts
    1,191

    Re: timer loses one second every tick!

    Ignore Timer2 for now since it seems Timer1 is what you have an issue with. As jmcilhinney suggested, you could do something like this to see that it fires every 7 seconds (plus a few milliseconds, typically, because of how Windows handles messages). If it isn't consistent then you are doing something that is causing the UI thread to hang, preventing messages from being handled in a timely manner.

    Code:
    Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
    
            'seconds.milliseconds, zero-padded so that, say, 7 ms prints as .007 rather than .7
            Console.WriteLine(DateTime.Now.ToString("ss.fff"))
    
        End Sub

  11. #11
    You don't want to know.
    Join Date
    Aug 2010
    Posts
    4,578

    Re: timer loses one second every tick!

    The Timer control, despite its name, is not designed to be 100% accurate. A gap of a full second is very odd, but some accumulated error is definitely expected.

    For the event to fire, a message has to go into your application's message queue. Then, your application has to be idle enough to process that message. Then, your code executes.

    If your program does anything that ties up the message queue, then you'll see some significant drift in timer ticks. If your computer's CMOS clock is on the fritz, you might see it tick faster. That's harder to detect than it used to be since most computers sync with atomic clocks over the internet now, but in the Bad Old Days it wasn't uncommon to have computers that lost 5-10 minutes daily.
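    A hypothetical way to see the effect (Button1 and Timer1 are placeholder names): while the handler below sleeps, no Tick messages are processed, so any tick that was due in that window arrives late.
    Code:
        Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
            'blocks the UI thread for 3 seconds; Tick events queued during
            'this time are only handled after the handler returns
            Threading.Thread.Sleep(3000)
        End Sub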

  12. #12
    Sinecure devotee
    Join Date
    Aug 2013
    Location
    Southern Tier NY
    Posts
    6,582

    Re: timer loses one second every tick!

    I can't tell from your example code in post #3 quite what your test case is.
    As a rule, it isn't good to use multiple timers anyway. You should use one timer at some higher frequency, keep track of the ticks, and divide out the work based on the relative frequency you want different things to run at (see the sketch below).
    If you want more accuracy on average over time, you'll need to check the expected elapsed time against a good clock and make adjustments.
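    A rough sketch of that single-timer idea (assuming one WinForms Timer named Timer1 with Interval = 1000; the names are placeholders, not the OP's exact project):
    Code:
        Private tickCount As Integer = 0
    
        Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
            tickCount += 1
            TextBox1.Text = CStr((tickCount - 1) Mod 7 + 1) 'the 1-to-7 counter runs every second
            If tickCount Mod 7 = 0 Then
                TextBox1.Text = "ON COM" 'the 7-second work runs on every 7th tick
            End If
        End Sub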

    Using the timer for timekeeping is not its purpose, and it is expected that it will lose time compared to clock time, because the timer will not fire (tick) until the interval you have selected has passed. So at best it could possibly tick on the interval you selected once in a while, but the vast majority of the time it will fire some number of milliseconds after the interval you've chosen.

    In the majority of the cases I've seen over the last six years or so, the WinForms timer uses a 64 Hz clock as its trigger, so it will fire at a multiple of that 64 Hz rate (a 15.625 ms interval). So if you pick 15 ms or less, it will run at 64 Hz. If you set 10 ms, expecting 100 Hz, then the 700 ticks you expect to take 7 seconds will really take 700/64 = around 10.9 seconds, so you would say you're losing 3.9 seconds per 7 seconds.

    I assume you're doing something similar (though I can't tell just what) with the two timer intervals, and the combination of differences is giving you the perceived 1 second loss per 7 seconds.

    If you want to see what the expected true loss should be, using a single timer set to fire at a 7000 ms interval on your machine, you can try running this test example code in a new project (add a Timer to the project).
    Code:
    Option Strict On
    Public Class Form1
      Dim sw As New Stopwatch
      Dim tbx As New TextBox
    
      Private Sub Form1_Load(sender As System.Object, e As System.EventArgs) Handles MyBase.Load
        Controls.Add(tbx)
        tbx.Size = New Size(400, 200)
        tbx.Multiline = True
        tbx.ScrollBars = ScrollBars.Vertical
        tbx.Visible = True
        tbx.BringToFront()
        Timer1.Interval = 7000
        sw.Start()
        Timer1.Start()
      End Sub
    
      Private Sub Timer1_Tick(sender As System.Object, e As System.EventArgs) Handles Timer1.Tick
        Static tickCount As Long
        tickCount += 1
    
        Dim elapsedMs As Long = sw.ElapsedMilliseconds
        Dim expectedVal As Long = tickCount * 7000
        Dim diff As Integer = CInt(elapsedMs - expectedVal)
     
        tbx.SelectedText =
          String.Format("Expected Value: {0}, StopWatch Reports: {1}, difference: {2} ms {3}",
                        expectedVal, elapsedMs.ToString, diff.ToString, vbNewLine)  'I can never remember the .net version of newline
      End Sub
    End Class
    On my machine, it tends to drift about 4 ms per cycle, so it would take around 250 cycles to lose a second; that comes out to losing around 2 seconds per hour (250 cycles * 7 seconds per cycle = 1750 seconds to lose one second, and 1750/60 = 29.16 minutes to lose one second).

    Now, if you want to adjust your tick interval to compensate, so that you don't drift forever in one direction but occasionally tick short to bring your average back to 7000, then add the following line after the diff calculation.
    Code:
    '...
        Dim diff As Integer = CInt(elapsedMs - expectedVal)
      
       Timer1.Interval = 7000 - diff
    '...
    If you add that line, the Timer1.Interval will be shortened by the diff.
    It won't actually have an effect until the combination of tick time and Interval time crosses that 15.625 ms clock tick boundary, so you will see the diff continue to grow, but it should never exceed 16 ms without being pulled back, so that you average your desired interval over time (16 ms assumes your machine is like my machine and the timer is using a 64 Hz clock for its trigger).

    Of course, because the tick event is tied in with other GUI events, there can be other things that push the event out quite a bit on any given tick, so having a way of compensating the interval to bring the running average back to where you want it is necessary if you're counting on X number of ticks per Y amount of time (and don't mind the "jitter" in the actual tick interval).
    Last edited by passel; Oct 19th, 2016 at 01:33 PM.

  13. #13

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by passel View Post
    I can't tell from.
    Your example code is a bit similar to the code I've posted. How can I make it accurate? I applied the interval - diff adjustment in the timer Tick event and it still seems to delay. I was confused about the milliseconds you're talking about in the timer control, but then I realized that the timer control is not really accurate. Thank you for making me see the delay in the timer tick.

  14. #14

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by Sitten Spynne View Post
    The Timer control, despite its name, is not designed to be 100% accurate. A gap of a full second is very odd, but some accumulated error is definitely expected...
    That's what I saw in passel's code. I thought there was something wrong in my code, but then the timer control just isn't really accurate.

  15. #15
    Powered By Medtronic dbasnett's Avatar
    Join Date
    Dec 2007
    Location
    Jefferson City, MO
    Posts
    9,764

    Re: timer loses one second every tick!

    It is not clear what you are doing, but the points that sitten and passel have made about Windows Forms timer accuracy are correct. The trick is to get off of the UI thread except when needed. In the following code I have used two buttons, a label, and an async method to illustrate a seven second countdown timer. Maybe it will give you some ideas.

    Code:
    Public Class Form1
    
        'note Async keyword on handler
        Private Async Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
            Button1.Enabled = False
            Label1.Text = ""
            Dim tsk7 As Task
            tsk7 = Task.Run(Sub()
                                SevenSecondCountDown()
                            End Sub)
            Await tsk7
            Button1.Enabled = True
        End Sub
    
        Private Sub Button2_Click(sender As Object, e As EventArgs) Handles Button2.Click
            'used to test UI response while '7 sec. timer' is running
            Label1.Text = DateTime.Now.ToString
        End Sub
    
        Private Sub SevenSecondCountDown()
            Dim sec7 As New TimeSpan(0, 0, 7)
            Dim stpw As New Stopwatch
            stpw.Start()
    
            Do
                Me.BeginInvoke(Sub()
                                   'update UI with progress
                                   Label1.Text = stpw.Elapsed.ToString
                               End Sub)
                Threading.Thread.Sleep(100)
            Loop While stpw.Elapsed < sec7
    
            Me.BeginInvoke(Sub()
                               Label1.Text = stpw.Elapsed.ToString
                           End Sub)
        End Sub
    End Class
    Last edited by dbasnett; Oct 20th, 2016 at 09:17 PM.
    My First Computer -- Documentation Link (RT?M) -- Using the Debugger -- Prime Number Sieve
    Counting Bits -- Subnet Calculator -- UI Guidelines -- >> SerialPort Answer <<

    "Those who use Application.DoEvents have no idea what it does and those who know what it does never use it." John Wein

  16. #16
    PowerPoster SJWhiteley's Avatar
    Join Date
    Feb 2009
    Location
    South of the Mason-Dixon Line
    Posts
    2,256

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    here is the sample code... 1000 = 1 second, right? I used a textbox to output the timer counter
    This isn't really timing the timer. This is like timing runners over a mile with the minute hand of a clock.

    Quote Originally Posted by earvinnill View Post
    ...I thought there was something wrong in my code but then the timer control isn't really accurate.
    Incorrect. The Windows Forms timer is accurate; there is something wrong with your code. It's not losing a second at all.

    If you need a 7 second interval - as long as you are not blocking the user interface (as others have noted) - then the timer tick event will be raised every 7 seconds. It won't be exactly 7 seconds, to an infinite precision, but it will be well within 100 milliseconds. Most likely plenty suitable for any application.

    Use a single timer and check the time in that tick event.

    If you need precision, then you need another mechanism: again, a forms timer can work, you just have to 'tick' a bit faster. You will only be as precise on each tick as the internals of the timer, though (as others have noted).

    In my Windows environment, using DateTime.Now, a 7 second tick is 'off' by about 4 milliseconds. Each tick is off by around this amount. Over a period of time, these inaccuracies will accumulate, so you may only get 9 ticks in a 70.0 second interval.
    Last edited by SJWhiteley; Oct 20th, 2016 at 07:35 AM.
    "Ok, my response to that is pending a Google search" - Bucky Katt.
    "There are two types of people in the world: Those who can extrapolate from incomplete data sets." - Unk.
    "Before you can 'think outside the box' you need to understand where the box is."

  17. #17
    Sinecure devotee
    Join Date
    Aug 2013
    Location
    Southern Tier NY
    Posts
    6,582

    Re: timer loses one second every tick!

    Quote Originally Posted by SJWhiteley View Post
    ...
    On my computer windows environment, using DateTime.Now, a 7 second tic is 'off' about 4 milliseconds. Each tick is off around this amount. Over a period of time, these inaccuracies will accumulate, so you may only get 9 ticks in a 70.0 second interval.
    Which validates what I said, and should be shown in the textbox of my example code.

    Also, as my example code shows, if you modify the Interval of the timer using the difference between the time you wanted the tick to fire, and the time it actually fired, you can have it fire every 7 seconds, within 16 ms normally for as long as you run.

    But, as SJWhiteley has noted and I mentioned, you can do things at particular points in the GUI that cause more than a 16 ms delay in the tick. For instance, if you click and hold on the title bar without moving the mouse, you can delay the tick firing by around 400 ms.
    As SJWhiteley has also mentioned about using a non-GUI thread for timing: for interfaces where a timer ticking more than 40 ms late can cause the interface to flag a problem, I definitely use a background thread to do the timing and update the interface.
    But even then (I log any "slips" of the timer being more than 20 ms from the time it was meant to fire), I still get a report now and again of the tick being late, even using a background thread.
    Windows is not a real-time system, so you have to live with some jitter, or burn some CPU on a background processing loop that runs continuously.

    TizzyT reports that he has a highly accurate Timer class that doesn't burn a lot of CPU, but I haven't been able to check it out yet to see how it works. I work with VB2010, so I need to get on a machine that has VB2012 or later (or a 2010 install updated to support the Async/Task mechanism used in the code), and I have been tied up for several days.
    I did look at the code a bit to see if I could manually adapt it back to an un-updated 2010 tasking mechanism, but decided it would take more time than I could allow, so I will just wait until I get access to a machine with a later version of VS.

  18. #18

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by SJWhiteley View Post
    This isn't really timing the timer.
    Thank you for your explanation, SJ. So I am messing with the GUI, and that is why the timer is a bit off. How can I run it on a background thread?

  19. #19

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by passel View Post
    Which validates what I said
    Why does the timer control behave like that?

  20. #20
    Super Moderator jmcilhinney's Avatar
    Join Date
    May 2005
    Location
    Sydney, Australia
    Posts
    110,344

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    Why does the timer control behave like that?
    There is no timer control. The System.Windows.Forms.Timer class does not inherit the System.Windows.Forms.Control class, either directly or indirectly, so it is not a control. The fact that it can be added in the designer does not make it a control. All a type has to do to be usable in the designer is implement the IComponent interface, which is usually done by inheriting the Component class, which Timer does.

    The fact is that a WinForms Timer will raise its Tick event pretty close to whatever you set the Interval to. Whether that event gets handled immediately when it's raised depends on how busy the UI thread is, as is always the case with methods executed on the UI thread.

    If you need to execute code on the UI thread then you're going to end up in this boat regardless. Even if you don't use a WinForms Timer to raise a Tick event, you're still going to have to marshal a call to the UI thread and you'll still have to wait if it's busy.

    If you don't need to execute your code on the UI thread then don't. You can use a Timers.Timer to raise its Elapsed event on a secondary thread. Just make sure that the SynchronizingObject property is set to Nothing or it will act like a WinForms Timer. Obviously you can't make any changes to the UI in that Elapsed event handler though.
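    A minimal sketch of that approach (the field name is mine; with SynchronizingObject left as Nothing, Elapsed is raised on a thread-pool thread, so no controls may be touched in the handler):
    Code:
        Private WithEvents backgroundTimer As New Timers.Timer(7000)
    
        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            backgroundTimer.Start()
        End Sub
    
        Private Sub backgroundTimer_Elapsed(sender As Object, e As Timers.ElapsedEventArgs) Handles backgroundTimer.Elapsed
            'runs on a thread-pool thread, not the UI thread
            Console.WriteLine(e.SignalTime.ToString("HH:mm:ss.fff"))
        End Sub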

    For future reference, maybe get to what you're actually trying to achieve a bit sooner, rather than how you're trying to achieve it. If we know what it is you want to do then we can go straight to how best to achieve it, instead of worrying about why what you're doing doesn't achieve some unknown aim.

  21. #21

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by jmcilhinney View Post
    There is no timer control...
    What I've really been trying to achieve: there is a 3-in-1 measuring instrument where the material automatically moves from one measuring instrument to another, and there is an interval for when it is going to measure. But there is a catch: if the material does not meet the requirements, the machine will stop, and the user needs to remove the material from the machine manually and then press the machine's start button again. I've assessed the machine and I have this kind of problem. First, I don't think a time interval will be accurate for this machine, because even if I perfected the timing, I'm not sure what the perfect timing for the robot's measurement of the material is, so I can get a slightly off measurement. What if the machine is still calibrating the measurement of the material? Then I can be wrong if I set it to this time. My colleague said it would be very efficient if there were an external device (PLC) which stores all the data from the machine, and I'd just get the data from it, so that would be the easy way to do it. However, I don't think they are going to buy that device. I'm just testing getting the data from the instrument; it is still not my main project. I'm still new to my job because I'm a fresh graduate, which is why your opinions are very helpful to me. Thank you JM for always lecturing me. Haha. I've gained knowledge since the first topic I posted here.

  22. #22
    PowerPoster
    Join Date
    Feb 2012
    Location
    West Virginia
    Posts
    14,205

    Re: timer loses one second every tick!

    How are you getting the data from the instrument? COM, IP?

    One method I have used in the past for timing is to set a timer to a lower interval, say 200 ms, then in the timer event check the system time. No matter how many seconds, minutes, or hours you want it to fire at, you can get within 10-20 ms using this method, provided you don't have long-running code stopping it from happening.
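    For example, something like this sketch (Timer1 and the dueTime field are my own illustration, not a fixed recipe):
    Code:
        Private dueTime As DateTime
    
        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            dueTime = DateTime.Now.AddSeconds(7)
            Timer1.Interval = 200 'fast tick used only to poll the clock
            Timer1.Start()
        End Sub
    
        Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
            If DateTime.Now >= dueTime Then
                'reschedule from the due time, not from Now, so the error doesn't accumulate
                dueTime = dueTime.AddSeconds(7)
                'do the 7-second work here
            End If
        End Sub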

    That said, normally you would have an event-driven routine for receiving data and not need a timer; the data would simply be received when sent. You may want to have a timer for a timeout condition.
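    A skeleton of that event-driven receive using System.IO.Ports.SerialPort (the port name and settings here are placeholders):
    Code:
        Private WithEvents port As New IO.Ports.SerialPort("COM1", 9600)
    
        Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
            port.Open()
        End Sub
    
        Private Sub port_DataReceived(sender As Object, e As IO.Ports.SerialDataReceivedEventArgs) Handles port.DataReceived
            Dim data As String = port.ReadExisting()
            'DataReceived fires on a worker thread, so marshal to the UI thread before touching controls
            Me.BeginInvoke(Sub() TextBox1.AppendText(data))
        End Sub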

    Also when dealing with something like this two way communication is the way to go.

  23. #23

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by DataMiser View Post
    How are you getting the data from the instrument? Com, IP ??

    Through the COM port; that is why I need a PLC that stores the data, so the data will be accurate.

  24. #24
    Member
    Join Date
    Oct 2016
    Posts
    32

    Re: timer loses one second every tick!

    The built-in timers aren't that accurate. If you need more accuracy you can try the multimedia timer, which is accurate to about 1 ms; if you need even more accuracy you can try my MicroTimer.

    My MicroTimer: http://pastebin.com/NnvJQCmK
    Download: http://www.mediafire.com/file/21xkad...MicroTimer.dll

    Usage:
    Code:
    Private Rate As Double = 1.0 / 7.0 'rate in fires per second (1/7 = once every 7 seconds, like setting a timer's Interval to 7000)
    
    Public WithEvents Tmr As New MicroTimer(Rate)
    
    Private Sub Tmr_Elapsed() Handles Tmr.Elapsed
        ' Some code here
    End Sub
    Please comment on any improvements you make.
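    For reference, the multimedia timer mentioned above is the timeSetEvent API in winmm.dll. A rough P/Invoke sketch (usage assumed from the Win32 documentation; the class and member names are mine):
    Code:
        Imports System.Runtime.InteropServices
    
        Public Class MultimediaTimerDemo
            'callback signature required by timeSetEvent
            Private Delegate Sub TimeProc(uTimerID As UInteger, uMsg As UInteger, dwUser As IntPtr, dw1 As IntPtr, dw2 As IntPtr)
    
            <DllImport("winmm.dll")>
            Private Shared Function timeSetEvent(uDelay As UInteger, uResolution As UInteger, lpTimeProc As TimeProc, dwUser As IntPtr, fuEvent As UInteger) As UInteger
            End Function
    
            <DllImport("winmm.dll")>
            Private Shared Function timeKillEvent(uTimerID As UInteger) As UInteger
            End Function
    
            Private Const TIME_PERIODIC As UInteger = 1
            Private callback As TimeProc 'held in a field so the delegate isn't garbage-collected while the timer runs
            Private timerId As UInteger
    
            Public Sub StartTimer()
                callback = AddressOf OnTick
                timerId = timeSetEvent(7000, 1, callback, IntPtr.Zero, TIME_PERIODIC) '7000 ms period, 1 ms resolution
            End Sub
    
            Public Sub StopTimer()
                timeKillEvent(timerId)
            End Sub
    
            Private Sub OnTick(uTimerID As UInteger, uMsg As UInteger, dwUser As IntPtr, dw1 As IntPtr, dw2 As IntPtr)
                'runs on a worker thread, not the UI thread
                Console.WriteLine(DateTime.Now.ToString("HH:mm:ss.fff"))
            End Sub
        End Class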
    Last edited by TizzyT; Oct 21st, 2016 at 05:37 AM.

  25. #25
    Powered By Medtronic dbasnett's Avatar
    Join Date
    Dec 2007
    Location
    Jefferson City, MO
    Posts
    9,764

    Re: timer loses one second every tick!

    What are the COM port settings? Is the device poll/respond, or can it send information without being polled? What is the smallest and largest block of information transmitted?

  26. #26
    Frenzied Member
    Join Date
    Dec 2014
    Location
    VB6 dinosaur land
    Posts
    1,191

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    My colleague said it would be very efficient if there were an external device (PLC) which stores all the data from the machine, and I'd just get the data from it, so that would be the easy way to do it.
    Your program could do the same thing.

    what if the machine is still calibrating the measurement of the material
    It seems the real issue for you is knowing where the machine is in its measuring process. If you stored all the data points (or read them from this other device you don't think they will buy), how would you know which one is correct? Are there no digital outputs from the machine (or its controlling PLC) you could read with a DIO card?

  27. #27

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by topshot View Post
    Your program could do the same thing. ... how would you know which one is correct?...

    That device will know that it is correct. Trust them. Haha. They also did that with the last PLC I connected to. And there is an address that will be given to me by the device supplier if they buy it. There is a large possibility that they will need to buy that device, because creating the program without it would be very complicated, plus the machine interval can be changed depending on the user's request.

  28. #28
    You don't want to know.
    Join Date
    Aug 2010
    Posts
    4,578

    Re: timer loses one second every tick!

    That sounds goofy enough that I'm pretty sure I misinterpreted it?

    If the device takes three measurements and periodically switches between "modes", surely it does something to signal when it's changing? Otherwise, what happens in the case where something goes wrong on the computer and the program has to be restarted? It'd be a nightmare if you had to stop a production line to restart all the equipment every time that happened. If it really behaved this way, I'd nope my way towards a better piece of hardware.

    And if it does have a way for you to determine which "mode" it's in, you don't need a timer. Just listen for/poll for that.

  29. #29
    Frenzied Member
    Join Date
    Dec 2014
    Location
    VB6 dinosaur land
    Posts
    1,191

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    That device will know that it is correct. Trust them. Haha...
    If this proposed device can know the value is right, so can you, unless your measurement machine has some proprietary protocol that only this device can access for some reason. There is either something within the data, or other commands/DIO bits, that allows this device to determine the correct value. You need to figure out what that is.

  30. #30

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by topshot View Post
    If this proposed device can know the value is right, so can you...
    Yes. The supplier will tell me what I need to do to receive the data from it.

  31. #31

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by topshot View Post
    If this proposed device can know the value is right, so can you...
    That would be a yes. I asked about the timer and why it delays because I was in the testing stage, assessing whether it can be done with just a timer.

  32. #32
    Powered By Medtronic dbasnett's Avatar
    Join Date
    Dec 2007
    Location
    Jefferson City, MO
    Posts
    9,764

    Re: timer loses one second every tick!

    sitten and I asked some questions that you did not answer. Would you please take a moment and do so.

  33. #33

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by dbasnett View Post
    What are the COM port settings?...
    Oh, sorry! I forgot to reply. The supplier will declare its COM port settings, and I will request that it must send info when it has new material that has been measured. What do you mean by the smallest and largest block of information transmitted?

  34. #34

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by Sitten Spynne View Post
    That sounds goofy enough that I'm pretty sure I misinterpreted it?
    Yes, I don't need a timer for it. I need the machine to prompt that it has new data stored, so at that time I will go and retrieve the data from it. And about when something goes wrong on the computer: maybe that will be a limitation of it? Or is there a way for me to do that? Anyway, I'm no PLC programmer, so I don't know anything about the machines. I'm just asking what I need in order to retrieve the data from the machine.

  35. #35
    Powered By Medtronic dbasnett's Avatar
    Join Date
    Dec 2007
    Location
    Jefferson City, MO
    Posts
    9,764

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    Oh, sorry! I forgot to reply...
    I am asking about the device you are using. You've lost me.

  36. #36

    Thread Starter
    Hyperactive Member
    Join Date
    Aug 2016
    Posts
    279

    Re: timer loses one second every tick!

    Quote Originally Posted by dbasnett View Post
    I am asking about the device you are using. You've lost me.
    Keyence.

  37. #37
    Powered By Medtronic dbasnett's Avatar
    Join Date
    Dec 2007
    Location
    Jefferson City, MO
    Posts
    9,764

    Re: timer loses one second every tick!

    Quote Originally Posted by earvinnill View Post
    Keyence.
    They have a lot of products. Is there a specific machine you are working with and can you share that information? A link to the specific product would be ideal.
