Author Topic: iqconnect Out of Memory (5 messages, Page 1 of 1)

beauchamp117
-Interested User-
Posts: 1
Joined: May 25, 2018

EMH


Posted: Oct 18, 2023 03:14 PM          Msg. 1 of 5
Running out of memory when watching 400 equities on Level 1 or 76 futures contracts on Level 2. I am using the Python library, which works great, but I am having trouble determining what this message means. It looks like my data feed stays up, but the notification is concerning.



File Attached: Screenshot 2023-10-18 161301.png (downloaded 314 times)

mkvalor
-Interested User-
Posts: 26
Joined: Oct 6, 2020

Keep your tools sharp.


Posted: Feb 23, 2024 01:58 AM          Msg. 2 of 5
I'm a user like yourself, but I've seen a few posts in the past dealing with memory issues.

One thing: do you happen to know if your app makes a separate connection to IQConnect.exe for each symbol? Especially using Python, I could see this adding up to a lot of CPU and memory overhead. Even though that strategy can be more convenient (because each program only has to deal with data for a single symbol), it can be far more memory- and CPU-efficient to open multiple symbol watches (say, 50 at a time) on a single connection to IQConnect and then use logic in your application to route each line based on the ticker symbol at the start of it (to a separate data file, to a data stream queue like Kafka, or to some other handler that calculates moving averages, etc.).

For example, you could use a Python dict where the key is the IQFeed symbol and the value is an open file handle for that symbol (if the symbol isn't in the dict yet, you create a new file and add it), then check the start of each line to look up the right file to write it to -- something like the sketch below. This is more complex, partly because of that routing logic and partly because you'd need something like a per-process config file listing the symbols each process should watch. If you got fancy, you could keep a single file with all the symbols, one per line, and pass command-line parameters telling each process which line in the symbols file to start at and how many symbols to read.
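Here's a rough, untested sketch of that idea. It assumes IQConnect.exe is already running, that Level 1 streams on the default port 5009, that symbols are watched with a "w<SYMBOL>" command, and that quote updates carry the symbol in the second comma-separated field -- check those details against the protocol docs for your version before relying on it:

import socket

# Placeholders: the symbols this particular process is responsible for,
# and the assumed default IQFeed Level 1 host/port -- verify yours.
SYMBOLS = ["AAPL", "MSFT", "SPY"]
HOST, LEVEL1_PORT = "127.0.0.1", 5009

def main():
    sock = socket.create_connection((HOST, LEVEL1_PORT))
    reader = sock.makefile("r", encoding="ascii")

    # One connection, many watches ("w<SYMBOL>" assumed as the watch command).
    for sym in SYMBOLS:
        sock.sendall(f"w{sym}\r\n".encode("ascii"))

    files = {}  # symbol -> open file handle, created lazily
    try:
        for line in reader:
            fields = line.rstrip("\r\n").split(",")
            # Assumes quote updates look like "Q,<symbol>,..."; everything
            # else (system/timestamp messages) is ignored in this sketch.
            if len(fields) < 2 or fields[0] != "Q":
                continue
            sym = fields[1]
            fh = files.get(sym)
            if fh is None:
                fh = files[sym] = open(f"{sym}.csv", "a")
            fh.write(line)
    finally:
        for fh in files.values():
            fh.close()
        sock.close()

if __name__ == "__main__":
    main()

The same lookup-and-route idea works if the destination is a Kafka topic or an in-memory queue instead of a file.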

Anyway -- enough advice based on guessing. If you're already reading multiple symbols per process, then it must be something else.



-Mark D. Valor
Edited by mkvalor on Feb 23, 2024 at 02:01 AM
Edited by mkvalor on Feb 23, 2024 at 02:03 AM

DTN_Gary_Stephen
-DTN Guru-
Posts: 396
Joined: Jul 3, 2019


Posted: Feb 23, 2024 04:29 PM          Msg. 3 of 5
It is always advisable to keep the number of connections to a minimum, and to keep them open rather than frequently closing and reopening them. IQFeed will allow you to make multiple connections to the same socket from the same machine, and this has its uses: making 2-4 connections and dividing the workload among them (see the sketch below) can help overall speed when handling a large number of symbols. But having a very large number of connections, or frequently disconnecting and reconnecting, is counterproductive and may contribute to this kind of situation.
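For illustration, here is a rough, untested sketch of dividing a large symbol list across a small, fixed number of connections, each serviced by its own thread. The port number and the "w<SYMBOL>" watch command are assumptions based on the standard Level 1 protocol; adjust for your setup and add real parsing and error handling before using it:

import socket
import threading

HOST, LEVEL1_PORT = "127.0.0.1", 5009   # assumed default Level 1 port
NUM_CONNECTIONS = 4                      # keep this small, per the advice above

def worker(symbols, handle_line):
    # One long-lived connection per chunk of symbols; all its watches share it.
    sock = socket.create_connection((HOST, LEVEL1_PORT))
    reader = sock.makefile("r", encoding="ascii")
    for sym in symbols:
        sock.sendall(f"w{sym}\r\n".encode("ascii"))
    for line in reader:
        handle_line(line)  # parse and route as needed

def start(all_symbols, handle_line):
    # Round-robin split of the symbol list across NUM_CONNECTIONS workers.
    chunks = [all_symbols[i::NUM_CONNECTIONS] for i in range(NUM_CONNECTIONS)]
    threads = [threading.Thread(target=worker, args=(chunk, handle_line), daemon=True)
               for chunk in chunks if chunk]
    for t in threads:
        t.start()
    return threads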

Sincerely,
Gary Stephen
DTN IQFeed Implementation Support Specialist

trulyunstable
-Interested User-
Posts: 1
Joined: Apr 4, 2024


Posted: Apr 4, 2024 03:37 AM          Msg. 4 of 5
I'm wondering if keeping it on all the time will drain the battery?

hersanchez
-Interested User-
Posts: 1
Joined: May 1, 2024


Posted: Yesterday @ 10:42 PM          Msg. 5 of 5
It sounds like your system is hitting memory limits when processing Level 1 data for 400 equities or Level 2 data for 76 futures contracts. The notification indicates that available memory may not be sufficient for that volume of data, even if the feed itself stays up. It would be advisable to review your system's memory allocation and consider optimizing your code to handle large data volumes more efficiently.