streams efficiency - question (11.1.0.6.0)
streams efficiency - question [message #551670] Fri, 20 April 2012 01:20
piotrtal
Hi.

My Streams environment works perfectly between two 11g instances, but I am concerned about its efficiency.
I think I have configured the environment well. I used the Streams advisors to size the pools, and even gave them more than recommended.

The streams pool is about 300M on both source and destination.

aq_tm_processes=1
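For completeness, this is roughly how I set them (just a sketch of my test settings, nothing more):

  -- rough sketch of my test settings (assumes both instances use an spfile)
  ALTER SYSTEM SET streams_pool_size = 300M SCOPE = BOTH;
  ALTER SYSTEM SET aq_tm_processes = 1 SCOPE = BOTH;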

And now my concern:
My setup is only for test purposes, so please don't yell at me.
For the test I am replicating one table with 15000 rows. The first column is a datetime and is refreshed every 5 minutes with SYSDATE.
After 20-30 minutes the replication stalls and the capture process shows the status 'Paused for Flow Control'.
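The test job itself is nothing fancier than something like this (table and column names are just placeholders for my test objects):

  -- runs every 5 minutes; placeholder names for my test table
  UPDATE test_tab
     SET refresh_time = SYSDATE;   -- touches all 15000 rows
  COMMIT;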

v$buffered_publishers reports 'IN FLOW CONTROL: TOO MANY UNBROWSED MESSAGES'.
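For anyone who wants to check the same thing, these are roughly the queries I used (trimmed to the columns I looked at):

  -- publisher state on the source ('IN FLOW CONTROL: ...' shows up here)
  SELECT queue_name, sender_name, num_msgs, publisher_state
    FROM v$buffered_publishers;

  -- capture process state ('PAUSED FOR FLOW CONTROL')
  SELECT capture_name, state
    FROM v$streams_capture;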

After bouncing the database, everything starts working properly again: the capture process scans the logs and pushes all the content through propagation to the destination tables.

This situation happens every time. Now I am updating only 1000 rows with SYSDATE - we'll see what happens.

My question: is it possible that a Streams environment cannot properly propagate 15000 records committed in a single transaction (one commit after the last update)?
I was sure there would be no problem with this, and now... such a surprise.

thanks in advance.
Re: streams efficiency - question [message #551680 is a reply to message #551670] Fri, 20 April 2012 01:53
Michel Cadot
Yes it is possible.
Streams is very inefficient.
It propagates one log record per updated row (NOT per statement) and so updates the rows on the target one by one.

Regards
Michel
Re: streams efficiency - question [message #551690 is a reply to message #551680] Fri, 20 April 2012 02:04
John Watson
Unlike some people, I think Streams is a pretty good product. But it does need careful setup and can be horribly complicated. Have you gone through note 746247.1, which is the starting point for this sort of issue?
Re: streams efficiency - question [message #551702 is a reply to message #551690] Fri, 20 April 2012 04:06
piotrtal
John Watson wrote on Fri, 20 April 2012 09:04
Unlike some people, I think Streams is a pretty good product. But it does need careful setup and can be horribly complicated. Have you gone through note 746247.1, which is the starting point for this sort of issue?


No, but thanks John. I will read it.

I narrowed down the number of records updated periodically to 1000 - same situation. Streams worked for about 30 minutes and then stopped with the same symptoms as above.
Now I have 10 records updated per transaction every 5 minutes - still working.
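To cap the batch I simply limited the statement, roughly like this (placeholder names again):

  -- limit the update to 1000 arbitrary rows (placeholder names)
  UPDATE test_tab
     SET refresh_time = SYSDATE
   WHERE ROWNUM <= 1000;
  COMMIT;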

But never mind. I assumed that 1000 rows per transaction would not be too much for a Streams environment, so this was an unpleasant surprise.
Re: streams efficiency - question [message #551717 is a reply to message #551702] Fri, 20 April 2012 04:54
Michel Cadot
Streams is good in an OLTP environment (even with complex changes), where you insert/update/delete rows one by one, NOT for bulk modifications.
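To make the difference concrete (table and column names are only an illustration):

  -- OLTP-style: one single-row statement per change - fine for Streams
  UPDATE orders SET status = 'SHIPPED' WHERE order_id = 42;

  -- bulk-style: one statement touching thousands of rows; Streams still
  -- captures and applies one row LCR per row - this is what hurts
  UPDATE orders SET status = 'ARCHIVED' WHERE order_date < SYSDATE - 365;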

Regards
Michel
Re: streams efficiency - question [message #551718 is a reply to message #551717] Fri, 20 April 2012 05:01
piotrtal
Michel Cadot wrote on Fri, 20 April 2012 11:54
Streams is good in an OLTP environment (even with complex changes), where you insert/update/delete rows one by one, NOT for bulk modifications.

Regards
Michel


That really surprises me, because I thought Streams was perfect for things like ETL processes, which mostly consist of very large transactions to be propagated. I read this in a technical book, but maybe the book was wrong. Smile

Update:
At the moment I am updating 10 records every 5 seconds - everything works perfectly.
Re: streams efficiency - question [message #551759 is a reply to message #551718] Fri, 20 April 2012 08:14
piotrtal
Update:

Now I am updating the datetime column on all 15000 rows (as in my first post), but with one transaction per record.
EVERYTHING WORKS FINE!
The problem was, and is, with large transactions (over 1000 records per transaction) - they appear to be the bottleneck for Streams.
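For completeness, the working variant is essentially a loop like this (a sketch with placeholder names; one transaction per row):

  -- sketch of the working test: single-row update + commit per row
  BEGIN
    FOR r IN (SELECT id FROM test_tab) LOOP
      UPDATE test_tab
         SET refresh_time = SYSDATE
       WHERE id = r.id;
      COMMIT;   -- one transaction per row
    END LOOP;
  END;
  /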
Re: streams efficiency - question [message #551828 is a reply to message #551759] Sat, 21 April 2012 01:35
piotrtal
still working Wink

But back to my doubts about whether Streams is a solution for replicating warehouse data. I read this in "Oracle 11g R1R2 Real Application Clusters Essentials":

===================
Streams is not a true disaster recovery solution or high availability option, but more
of a complementary solution to enhance the availability options provided by Oracle
Data Guard and Oracle RAC technologies. One of the most common ways to use this
technology is with large Oracle data warehouses and data marts to replicate a subset
of the source data to another environment for testing and verification purposes.
=================

And as far as I know, warehouses are loaded with tons of DML.
But this description is a little confusing to me, given my problems with "large" transactions (15000 DMLs per transaction) in my Streams environment.
Re: streams efficiency - question [message #551829 is a reply to message #551828] Sat, 21 April 2012 01:41
Michel Cadot
Genuine data warehouse databases are filled from OLTP ones; there are tons of DMLs, but they are single-row ones, so no problem for Streams in this case.
It can support large transactions as long as each statement is a single-row one.
The problem is not the transaction size, it is the size of each DML statement.

Regards
Michel
Re: streams efficiency - question [message #551831 is a reply to message #551829] Sat, 21 April 2012 02:11
piotrtal
Michel Cadot wrote on Sat, 21 April 2012 08:41

...
It can support large transactions as long as each statement is a single-row one.
...


That's the answer to my problem - thanks. It looks like this was the cause.

But what would you do if, for example, 99% of your DML is single-row, but once a week there is one big update touching, let's say, 200,000 rows of a table which needs to be replicated through Streams? And you have no possibility to modify this problematic statement, because it is part of software that is no longer supported by the vendor, and no one is able to rewrite that part of its code.

Streams would probably fail in this case.

One solution could be to put a trigger on this table and propagate the changes to another table using single-row updates, and then replicate that table instead of the first one (see the sketch below).
... but is propagating the first table itself really not possible using Streams?
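Roughly what I have in mind with the trigger idea (hypothetical names, untested; only the shadow table would be replicated):

  -- hypothetical sketch: mirror each change into a shadow table with
  -- single-row statements, and replicate the shadow table instead
  CREATE OR REPLACE TRIGGER trg_mirror_big_tab
  AFTER UPDATE ON big_tab
  FOR EACH ROW
  BEGIN
    UPDATE big_tab_shadow
       SET refresh_time = :NEW.refresh_time
     WHERE id = :NEW.id;
  END;
  /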
Re: streams efficiency - question [message #551832 is a reply to message #551831] Sat, 21 April 2012 02:17
John Watson
Hello again. You seem to be hung up on the idea that "Streams doesn't work for a large update statement". But your only evidence is that you had a problem with flow control, which you have not tried to fix. There are many articles on metalink regarding how to tune Streams for this.
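(Just as an example of the kind of knob those articles discuss - the process names below are placeholders and the right values depend on what the note turns up:)

  -- example only: raise apply and capture parallelism (placeholder names)
  BEGIN
    DBMS_APPLY_ADM.SET_PARAMETER(
      apply_name => 'APPLY_TEST',
      parameter  => 'parallelism',
      value      => '4');
    DBMS_CAPTURE_ADM.SET_PARAMETER(
      capture_name => 'CAPTURE_TEST',
      parameter    => 'parallelism',
      value        => '2');
  END;
  /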
Re: streams efficiency - question [message #551833 is a reply to message #551832] Sat, 21 April 2012 02:30
piotrtal
John Watson wrote on Sat, 21 April 2012 09:17
Hello again. You seem to be hung up on the idea that "Streams doesn't work for a large update statement". But your only evidence is that you had a problem with flow control, which you have not tried to fix. There are many articles on metalink regarding how to tune Streams for this.


I will do some more research and try to fix my 'flow control' problem next week.

I have tried the advisors, read articles, and tuned my Streams setup, but nothing has helped so far, so I suspected that maybe Streams simply cannot deal with a 15000-row update.

... so by starting this discussion I wanted to get your opinion on whether this problem might be impossible to overcome. If there is still a chance to make Streams propagate a 15000-row update in one transaction, I will try again next week.
Re: streams efficiency - question [message #551834 is a reply to message #551832] Sat, 21 April 2012 02:31
Michel Cadot
I have done it for real at several clients (in the car and finance industries), and I can say that it indeed does not work with large DML; even the best Oracle experts could not tune Streams to make it work. Every week the flows were in error, with a backlog of 2 days per day (it takes 2 days to apply one day of redo records). (Not to mention the increasing load on the source database, which penalized the source clients and was only partly solved by using a remote capture process in the next version.)
So no, it does not work with large DML in the real world.
You know what, the client chose to use Sybase Replication Server instead and is now investigating Golden Gate.

Regards
Michel
Re: streams efficiency - question [message #551836 is a reply to message #551834] Sat, 21 April 2012 03:25
John Watson
OK, fair enough. I don't like it when options are thrown out on purely anecdotal evidence, but clearly you put real research into it.