From: Wolfgang M. M. <wol...@us...> - 2005-12-31 18:38:44
Update of /cvsroot/exist/eXist-1.0/webapp/sandbox In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv27914/webapp/sandbox Modified Files: sandbox.xql browse.xql Added Files: window_close_grey.gif Log Message: A few visual updates for the sandbox. --- NEW FILE: window_close_grey.gif --- (This appears to be a binary file; contents omitted.) Index: browse.xql =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/browse.xql,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** browse.xql 18 Dec 2005 19:49:22 -0000 1.2 --- browse.xql 31 Dec 2005 18:38:35 -0000 1.3 *************** *** 25,29 **** let $collection := request:request-parameter("collection", ()) return ! <ajax-response root="{replace($collection, '/[^/]*$', '')}"> { ajax:display-collection($collection) --- 25,29 ---- let $collection := request:request-parameter("collection", ()) return ! <ajax-response root="{replace($collection, '/[^/]*$', '', 'mx')}"> { ajax:display-collection($collection) Index: sandbox.xql =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/sandbox.xql,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** sandbox.xql 18 Dec 2005 19:49:22 -0000 1.5 --- sandbox.xql 31 Dec 2005 18:38:35 -0000 1.6 *************** *** 1,6 **** xquery version "1.0"; - declare option exist:serialize "method=xhtml indent=no"; - declare namespace sandbox="http://exist-db.org/xquery/sandbox"; --- 1,4 ---- *************** *** 9,12 **** --- 7,12 ---- import module namespace xdb="http://exist-db.org/xquery/xmldb"; + declare option exist:serialize "method=xhtml indent=no"; + (:~ Points to the location of the xml-highlight.xsl stylesheet stored in the database :) declare variable $sandbox:XML_HIGHLIGHT_STYLE { "/db/sandbox/xml-highlight.xsl" }; *************** *** 183,186 **** --- 183,187 ---- </fieldset> </div> + </div> <div id="save-panel"> <div> *************** *** 191,207 **** <h2>Export results to new document:</h2> ! <p> ! <label for="docname">Document path</label> ! <a href="#" id="export-resource">Click to select</a> ! </p> ! <p> ! <label for="wrapper">Wrapper element</label> ! <input type="text" id="wrapper"/> ! </p> <button type="button" id="export">Export</button> </div> </div> </div> - </div> </form> <div id="query-output"> --- 192,203 ---- <h2>Export results to new document:</h2> ! <label for="docname">Document path</label> ! <a href="#" id="export-resource">Click to select</a> ! <label for="wrapper">Wrapper element</label> ! <input type="text" id="wrapper"/> <button type="button" id="export">Export</button> </div> </div> </div> </form> <div id="query-output"> |

From: Wolfgang M. M. <wol...@us...> - 2005-12-31 18:38:44
Update of /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv27914/webapp/sandbox/scripts Modified Files: sandbox.js dragdrop.js scriptaculous.js ajax.js controls.js effects.js Log Message: A few visual updates for the sandbox. Index: controls.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/controls.js,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** controls.js 17 Dec 2005 15:31:19 -0000 1.3 --- controls.js 31 Dec 2005 18:38:35 -0000 1.4 *************** *** 222,227 **** return; } ! ! var value = Element.collectTextNodesIgnoreClass(selectedElement, 'informal'); var lastTokenPos = this.findLastToken(); if (lastTokenPos != -1) { --- 222,232 ---- return; } ! var value = ''; ! if (this.options.select) { ! var nodes = document.getElementsByClassName(this.options.select, selectedElement) || []; ! if(nodes.length>0) value = Element.collectTextNodes(nodes[0], this.options.select); ! } else ! value = Element.collectTextNodesIgnoreClass(selectedElement, 'informal'); ! var lastTokenPos = this.findLastToken(); if (lastTokenPos != -1) { *************** *** 449,453 **** --- 454,460 ---- this.options = Object.extend({ + okButton: true, okText: "ok", + cancelLink: true, cancelText: "cancel", savingText: "Saving...", *************** *** 472,475 **** --- 479,483 ---- highlightendcolor: "#FFFFFF", externalControl: null, + submitOnBlur: false, ajaxOptions: {} }, options || {}); *************** *** 537,550 **** } ! okButton = document.createElement("input"); ! okButton.type = "submit"; ! okButton.value = this.options.okText; ! this.form.appendChild(okButton); ! cancelLink = document.createElement("a"); ! cancelLink.href = "#"; ! cancelLink.appendChild(document.createTextNode(this.options.cancelText)); ! cancelLink. ! this.form.appendChild(cancelLink); }, hasHTMLLineBreaks: function(string) { --- 545,562 ---- } ! if (this.options.okButton) { ! okButton = document.createElement("input"); ! okButton.type = "submit"; ! okButton.value = this.options.okText; ! this.form.appendChild(okButton); ! } ! if (this.options.cancelLink) { ! cancelLink = document.createElement("a"); ! cancelLink.href = "#"; ! cancelLink.appendChild(document.createTextNode(this.options.cancelText)); ! cancelLink. ! this.form.appendChild(cancelLink); ! } }, hasHTMLLineBreaks: function(string) { *************** *** 562,569 **** --- 574,584 ---- text = this.getText(); } + + var obj = this; if (this.options.rows == 1 && !this.hasHTMLLineBreaks(text)) { this.options.textarea = false; var textField = document.createElement("input"); + textField.obj = this; textField.type = "text"; textField.name = "value"; *************** *** 572,583 **** --- 587,603 ---- var size = this.options.size || this.options.cols || 0; if (size != 0) textField.size = size; + if (this.options.submitOnBlur) + textField. this.editField = textField; } else { this.options.textarea = true; var textArea = document.createElement("textarea"); + textArea.obj = this; textArea.name = "value"; textArea.value = this.convertHTMLLineBreaks(text); textArea.rows = this.options.rows; textArea.cols = this.options.cols || 40; + if (this.options.submitOnBlur) + textArea. this.editField = textArea; } *************** *** 748,750 **** this.callback(this.element, $F(this.element)); } ! }; \ No newline at end of file --- 768,770 ---- this.callback(this.element, $F(this.element)); } ! 
}; Index: dragdrop.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/dragdrop.js,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** dragdrop.js 17 Dec 2005 15:31:19 -0000 1.3 --- dragdrop.js 31 Dec 2005 18:38:35 -0000 1.4 *************** *** 147,150 **** --- 147,151 ---- this._lastPointer = null; this.activeDraggable.endDrag(event); + this.activeDraggable = null; }, *************** *** 192,196 **** reverteffect: function(element, top_offset, left_offset) { var dur = Math.sqrt(Math.abs(top_offset^2)+Math.abs(left_offset^2))*0.02; ! element._revert = new Effect.MoveBy(element, -top_offset, -left_offset, {duration:dur}); }, endeffect: function(element) { --- 193,197 ---- reverteffect: function(element, top_offset, left_offset) { var dur = Math.sqrt(Math.abs(top_offset^2)+Math.abs(left_offset^2))*0.02; ! element._revert = new Effect.Move(element, { x: -left_offset, y: -top_offset, duration: dur}); }, endeffect: function(element) { *************** *** 228,233 **** currentDelta: function() { return([ ! parseInt(this.element.style.left || '0'), ! parseInt(this.element.style.top || '0')]); }, --- 229,234 ---- currentDelta: function() { return([ ! parseInt(Element.getStyle(this.element,'left') || '0'), ! parseInt(Element.getStyle(this.element,'top') || '0')]); }, Index: scriptaculous.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/scriptaculous.js,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** scriptaculous.js 17 Dec 2005 15:31:19 -0000 1.3 --- scriptaculous.js 31 Dec 2005 18:38:35 -0000 1.4 *************** *** 21,25 **** var Scriptaculous = { ! Version: '1.5.0', require: function(libraryName) { // inserting via DOM fails in Safari 2.0, so brute force approach --- 21,25 ---- var Scriptaculous = { ! Version: '1.5.1', require: function(libraryName) { // inserting via DOM fails in Safari 2.0, so brute force approach *************** *** 31,46 **** Prototype.Version.split(".")[1]) < 1.4) throw("script.aculo.us requires the Prototype JavaScript framework >= 1.4.0"); ! var scriptTags = document.getElementsByTagName("script"); ! for(var i=0;i<scriptTags.length;i++) { ! if(scriptTags[i].src && scriptTags[i].src.match(/scriptaculous\.js(\?.*)?$/)) { ! var path = scriptTags[i].src.replace(/scriptaculous\.js(\?.*)?$/,''); ! this.require(path + 'builder.js'); ! this.require(path + 'effects.js'); ! this.require(path + 'dragdrop.js'); ! this.require(path + 'controls.js'); ! this.require(path + 'slider.js'); ! break; ! } ! } } } --- 31,43 ---- Prototype.Version.split(".")[1]) < 1.4) throw("script.aculo.us requires the Prototype JavaScript framework >= 1.4.0"); ! ! $A(document.getElementsByTagName("script")).findAll( function(s) { ! return (s.src && s.src.match(/scriptaculous\.js(\?.*)?$/)) ! }).each( function(s) { ! var path = s.src.replace(/scriptaculous\.js(\?.*)?$/,''); ! var includes = s.src.match(/\?.*load=([a-z,]*)/); ! (includes ? includes[1] : 'builder,effects,dragdrop,controls,slider').split(',').each( ! function(include) { Scriptaculous.require(path+include+'.js') }); ! 
}); } } Index: sandbox.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/sandbox.js,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** sandbox.js 18 Dec 2005 19:49:22 -0000 1.7 --- sandbox.js 31 Dec 2005 18:38:35 -0000 1.8 *************** *** 51,59 **** if (!document.savePanelShow) { this.innerHTML = "Hide Options"; ! Effect.BlindDown('save-panel'); document.savePanelShow = true; } else { this.innerHTML = "Show Options"; ! Effect.BlindUp('save-panel'); document.savePanelShow = false; } --- 51,61 ---- if (!document.savePanelShow) { this.innerHTML = "Hide Options"; ! //Effect.BlindDown('save-panel'); ! Effect.toggle('save-panel', 'blind'); document.savePanelShow = true; } else { this.innerHTML = "Show Options"; ! //Effect.BlindUp('save-panel'); ! Effect.toggle('save-panel', 'blind'); document.savePanelShow = false; } Index: ajax.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/ajax.js,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** ajax.js 18 Dec 2005 19:49:22 -0000 1.2 --- ajax.js 31 Dec 2005 18:38:35 -0000 1.3 *************** *** 10,15 **** var html = '<div id="xmldb-open">' + ! ' <a href="#" id="xmldb-close">Close</a>' + ! ' <h1>Open Resource</h1>' + ' <div id="xmldb-inner">' + ' <input type="text" name="path" id="xmldb-path" />' + --- 10,15 ---- var html = '<div id="xmldb-open">' + ! ' <a href="#" id="xmldb-close"><img src="window_close_grey.gif" border="0"/></a>' + ! ' <h1 id="xmldb-title">Open Resource</h1>' + ' <div id="xmldb-inner">' + ' <input type="text" name="path" id="xmldb-path" />' + *************** *** 27,30 **** --- 27,31 ---- div.style.left = '25%'; div.style.top = '25%'; + new Draggable(div, { revert: false, handle: 'xmldb-title' }); this.submitButton = $('xmldb-submit'); Index: effects.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/scripts/effects.js,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** effects.js 17 Dec 2005 15:31:19 -0000 1.3 --- effects.js 31 Dec 2005 18:38:35 -0000 1.4 *************** *** 23,43 **** } return(color.length==7 ? color : (arguments[0] || this)); ! } ! Element.collectTextNodesIgnoreClass = function(element, ignoreclass) { ! var children = $(element).childNodes; ! var text = ''; ! var classtest = new RegExp('^([^ ]+ )*' + ignoreclass+ '( [^ ]+)*$','i'); ! ! for (var i = 0; i < children.length; i++) { ! if(children[i].nodeType==3) { ! text+=children[i].nodeValue; ! } else { ! if((!children[i].className.match(classtest)) && children[i].hasChildNodes()) ! text += Element.collectTextNodesIgnoreClass(children[i], ignoreclass); ! } ! } ! ! return text; } --- 23,41 ---- } return(color.length==7 ? color : (arguments[0] || this)); ! } ! Element.collectTextNodes = function(element) { ! return $A($(element).childNodes).collect( function(node) { ! return (node.nodeType==3 ? node.nodeValue : ! (node.hasChildNodes() ? Element.collectTextNodes(node) : '')); ! }).flatten().join(''); ! } ! ! Element.collectTextNodesIgnoreClass = function(element, className) { ! return $A($(element).childNodes).collect( function(node) { ! return (node.nodeType==3 ? node.nodeValue : ! ((node.hasChildNodes() && !Element.hasClassName(node,className)) ? ! Element.collectTextNodes(node) : '')); ! 
}).flatten().join(''); } *************** *** 130,133 **** --- 128,145 ---- new effect(element, Object.extend(options, { delay: index * options.speed + masterDelay })); }); + }, + PAIRS: { + 'slide': ['SlideDown','SlideUp'], + 'blind': ['BlindDown','BlindUp'], + 'appear': ['Appear','Fade'] + }, + toggle: function(element, effect) { + element = $(element); + effect = (effect || 'appear').toLowerCase(); + var options = Object.extend({ + queue: { position:'end', scope:(element.id || 'global') } + }, arguments[2] || {}); + Effect[Element.visible(element) ? + Effect.PAIRS[effect][1] : Effect.PAIRS[effect][0]](element, options); } }; *************** *** 167,180 **** /* ------------- core effects ------------- */ ! Effect.Queue = { ! effects: [], _each: function(iterator) { this.effects._each(iterator); }, - interval: null, add: function(effect) { var timestamp = new Date().getTime(); ! switch(effect.options.queue) { case 'front': // move unstarted effects after this effect --- 179,198 ---- /* ------------- core effects ------------- */ ! Effect.ScopedQueue = Class.create(); ! Object.extend(Object.extend(Effect.ScopedQueue.prototype, Enumerable), { ! initialize: function() { ! this.effects = []; ! this.interval = null; ! }, _each: function(iterator) { this.effects._each(iterator); }, add: function(effect) { var timestamp = new Date().getTime(); ! var position = (typeof effect.options.queue == 'string') ? ! effect.options.queue : effect.options.queue.position; ! ! switch(position) { case 'front': // move unstarted effects after this effect *************** *** 207,230 **** this.effects.invoke('loop', timePos); } } - Object.extend(Effect.Queue, Enumerable); Effect.Base = function() {}; Effect.Base.prototype = { position: null, - setOptions: function(options) { - this.options = Object.extend({ - transition: Effect.Transitions.sinoidal, - duration: 1.0, // seconds - fps: 25.0, // max. 25fps due to Effect.Queue implementation - sync: false, // true for combining - from: 0.0, - to: 1.0, - delay: 0.0, - queue: 'parallel' - }, options || {}); - }, start: function(options) { ! this.setOptions(options || {}); this.currentFrame = 0; this.state = 'idle'; --- 225,259 ---- this.effects.invoke('loop', timePos); } + }); + + Effect.Queues = { + instances: $H(), + get: function(queueName) { + if(typeof queueName != 'string') return queueName; + + if(!this.instances[queueName]) + this.instances[queueName] = new Effect.ScopedQueue(); + + return this.instances[queueName]; + } + } + Effect.Queue = Effect.Queues.get('global'); + + Effect.DefaultOptions = { + transition: Effect.Transitions.sinoidal, + duration: 1.0, // seconds + fps: 25.0, // max. 25fps due to Effect.Queue implementation + sync: false, // true for combining + from: 0.0, + to: 1.0, + delay: 0.0, + queue: 'parallel' } Effect.Base = function() {}; Effect.Base.prototype = { position: null, start: function(options) { ! this.options = Object.extend(Object.extend({},Effect.DefaultOptions), options || {}); this.currentFrame = 0; this.state = 'idle'; *************** *** 232,236 **** this.finishOn = this.startOn + (this.options.duration*1000); this.event('beforeStart'); ! if(!this.options.sync) Effect.Queue.add(this); }, loop: function(timePos) { --- 261,267 ---- this.finishOn = this.startOn + (this.options.duration*1000); this.event('beforeStart'); ! if(!this.options.sync) ! Effect.Queues.get(typeof this.options.queue == 'string' ? ! 
'global' : this.options.queue.scope).add(this); }, loop: function(timePos) { *************** *** 270,274 **** }, cancel: function() { ! if(!this.options.sync) Effect.Queue.remove(this); this.state = 'finished'; }, --- 301,307 ---- }, cancel: function() { ! if(!this.options.sync) ! Effect.Queues.get(typeof this.options.queue == 'string' ? ! 'global' : this.options.queue.scope).remove(this); this.state = 'finished'; }, *************** *** 320,330 **** }); ! Effect.MoveBy = Class.create(); ! Object.extend(Object.extend(Effect.MoveBy.prototype, Effect.Base.prototype), { ! initialize: function(element, toTop, toLeft) { ! this.element = $(element); ! this.toTop = toTop; ! this.toLeft = toLeft; ! this.start(arguments[3]); }, setup: function() { --- 353,366 ---- }); ! Effect.Move = Class.create(); ! Object.extend(Object.extend(Effect.Move.prototype, Effect.Base.prototype), { ! initialize: function(element) { ! this.element = $(element); ! var options = Object.extend({ ! x: 0, ! y: 0, ! mode: 'relative' ! }, arguments[1] || {}); ! this.start(options); }, setup: function() { *************** *** 334,348 **** // (to 0 if you do not need them) Element.makePositioned(this.element); - this.originalTop = parseFloat(Element.getStyle(this.element,'top') || '0'); this.originalLeft = parseFloat(Element.getStyle(this.element,'left') || '0'); }, update: function(position) { Element.setStyle(this.element, { ! top: this.toTop * position + this.originalTop + 'px', ! left: this.toLeft * position + this.originalLeft + 'px' }); } }); Effect.Scale = Class.create(); Object.extend(Object.extend(Effect.Scale.prototype, Effect.Base.prototype), { --- 370,395 ---- // (to 0 if you do not need them) Element.makePositioned(this.element); this.originalLeft = parseFloat(Element.getStyle(this.element,'left') || '0'); + this.originalTop = parseFloat(Element.getStyle(this.element,'top') || '0'); + if(this.options.mode == 'absolute') { + // absolute movement, so we need to calc deltaX and deltaY + this.options.x = this.options.x - this.originalLeft; + this.options.y = this.options.y - this.originalTop; + } }, update: function(position) { Element.setStyle(this.element, { ! left: this.options.x * position + this.originalLeft + 'px', ! top: this.options.y * position + this.originalTop + 'px' }); } }); + // for backwards compatibility + Effect.MoveBy = function(element, toTop, toLeft) { + return new Effect.Move(element, + Object.extend({ x: toLeft, y: toTop }, arguments[3] || {})); + }; + Effect.Scale = Class.create(); Object.extend(Object.extend(Effect.Scale.prototype, Effect.Base.prototype), { *************** *** 586,590 **** opacity: Element.getInlineOpacity(element) }; return new Effect.Parallel( ! [ new Effect.MoveBy(element, 100, 0, { sync: true }), new Effect.Opacity(element, { sync: true, to: 0.0 }) ], Object.extend( --- 633,637 ---- opacity: Element.getInlineOpacity(element) }; return new Effect.Parallel( ! [ new Effect.Move(element, {x: 0, y: 100, sync: true }), new Effect.Opacity(element, { sync: true, to: 0.0 }) ], Object.extend( *************** *** 603,618 **** top: Element.getStyle(element, 'top'), left: Element.getStyle(element, 'left') }; ! return new Effect.MoveBy(element, 0, 20, ! { duration: 0.05, afterFinishInternal: function(effect) { ! new Effect.MoveBy(effect.element, 0, -40, ! { duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.MoveBy(effect.element, 0, 40, ! { duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.MoveBy(effect.element, 0, -40, ! 
{ duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.MoveBy(effect.element, 0, 40, ! { duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.MoveBy(effect.element, 0, -20, ! { duration: 0.05, afterFinishInternal: function(effect) { with(Element) { undoPositioned(effect.element); setStyle(effect.element, oldStyle); --- 650,665 ---- top: Element.getStyle(element, 'top'), left: Element.getStyle(element, 'left') }; ! return new Effect.Move(element, ! { x: 20, y: 0, duration: 0.05, afterFinishInternal: function(effect) { ! new Effect.Move(effect.element, ! { x: -40, y: 0, duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.Move(effect.element, ! { x: 40, y: 0, duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.Move(effect.element, ! { x: -40, y: 0, duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.Move(effect.element, ! { x: 40, y: 0, duration: 0.1, afterFinishInternal: function(effect) { ! new Effect.Move(effect.element, ! { x: -20, y: 0, duration: 0.05, afterFinishInternal: function(effect) { with(Element) { undoPositioned(effect.element); setStyle(effect.element, oldStyle); *************** *** 738,742 **** } ! return new Effect.MoveBy(element, initialMoveY, initialMoveX, { duration: 0.01, beforeSetup: function(effect) { with(Element) { --- 785,791 ---- } ! return new Effect.Move(element, { ! x: initialMoveX, ! y: initialMoveY, duration: 0.01, beforeSetup: function(effect) { with(Element) { *************** *** 748,752 **** new Effect.Parallel( [ new Effect.Opacity(effect.element, { sync: true, to: 1.0, from: 0.0, transition: options.opacityTransition }), ! new Effect.MoveBy(effect.element, moveY, moveX, { sync: true, transition: options.moveTransition }), new Effect.Scale(effect.element, 100, { scaleMode: { originalHeight: dims.height, originalWidth: dims.width }, --- 797,801 ---- new Effect.Parallel( [ new Effect.Opacity(effect.element, { sync: true, to: 1.0, from: 0.0, transition: options.opacityTransition }), ! new Effect.Move(effect.element, { x: moveX, y: moveY, sync: true, transition: options.moveTransition }), new Effect.Scale(effect.element, 100, { scaleMode: { originalHeight: dims.height, originalWidth: dims.width }, *************** *** 808,812 **** [ new Effect.Opacity(element, { sync: true, to: 0.0, from: 1.0, transition: options.opacityTransition }), new Effect.Scale(element, window.opera ? 1 : 0, { sync: true, transition: options.scaleTransition, restoreAfterFinish: true}), ! new Effect.MoveBy(element, moveY, moveX, { sync: true, transition: options.moveTransition }) ], Object.extend({ beforeStartInternal: function(effect) { with(Element) { --- 857,861 ---- [ new Effect.Opacity(element, { sync: true, to: 0.0, from: 1.0, transition: options.opacityTransition }), new Effect.Scale(element, window.opera ? 1 : 0, { sync: true, transition: options.scaleTransition, restoreAfterFinish: true}), ! new Effect.Move(element, { x: moveX, y: moveY, sync: true, transition: options.moveTransition }) ], Object.extend({ beforeStartInternal: function(effect) { with(Element) { |

From: Wolfgang M. M. <wol...@us...> - 2005-12-31 18:38:43
Update of /cvsroot/exist/eXist-1.0/webapp/sandbox/styles In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv27914/webapp/sandbox/styles Modified Files: sandbox.css Log Message: A few visual updates for the sandbox. Index: sandbox.css =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/sandbox/styles/sandbox.css,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** sandbox.css 18 Dec 2005 19:49:22 -0000 1.5 --- sandbox.css 31 Dec 2005 18:38:35 -0000 1.6 *************** *** 51,54 **** --- 51,59 ---- } + button { + border: 1px solid #FFB900; + background-color: white; + } + h2 { font-size: 100%; *************** *** 59,64 **** width: 140px; height: 216px; ! background-color: #FC6; ! border: 1px solid #FFD100; padding: 5px 2px 5px 2px; } --- 64,69 ---- width: 140px; height: 216px; ! background-color: #FFE; ! border: 1px solid #FFB900; padding: 5px 2px 5px 2px; } *************** *** 66,71 **** #slots-panel h2 { font-weight: normal; ! color: white; ! border-bottom: 1px solid white; margin-bottom: 5px; padding: 0 0 5px 5px; --- 71,76 ---- #slots-panel h2 { font-weight: normal; ! color: black; ! border-bottom: 1px solid #FFB900; margin-bottom: 5px; padding: 0 0 5px 5px; *************** *** 81,85 **** #slots .num { float: left; ! color: white; width: 30px; text-align: right; --- 86,90 ---- #slots .num { float: left; ! color: #FFB900; width: 30px; text-align: right; *************** *** 96,100 **** #right-panel { margin-left: 150px; ! border: 1px solid #FFD100; } --- 101,105 ---- #right-panel { margin-left: 150px; ! border: 1px solid #FFB900; } *************** *** 148,156 **** #save-panel { padding: 0 3px 3px 5px; ! background-color: #FEFFAF; } #save-panel h2 { margin: 3px 0; } --- 153,165 ---- #save-panel { padding: 0 3px 3px 5px; ! margin-top: 5px; ! background-color: #FFE; ! border: 1px solid #FFB900; } #save-panel h2 { margin: 3px 0; + font-style: italic; + font-weight: normal; } *************** *** 159,162 **** --- 168,175 ---- } + #save-panel #export-resource { + margin-right: 10px; + } + #description, #path { min-width: 400px; *************** *** 230,236 **** #xmldb-open { width: 40%; ! background-color: #EEEEEE; font-size: 75%; ! border: 2px solid #999999; } --- 243,249 ---- #xmldb-open { width: 40%; ! background-color: white; font-size: 75%; ! border: 5px solid #DAD6D6; } |

From: Wolfgang M. M. <wol...@us...> - 2005-12-31 17:32:58
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv31979/src/org/exist/xquery/functions Modified Files: FunMatches.java FunReplace.java Log Message: Fixed: argument checks for fn:replace. Index: FunMatches.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunMatches.java,v retrieving revision 1.20 retrieving revision 1.21 diff -C2 -d -r1.20 -r1.21 *** FunMatches.java 30 Dec 2005 12:51:44 -0000 1.20 --- FunMatches.java 31 Dec 2005 17:32:50 -0000 1.21 *************** *** 121,125 **** steps.add(arg); ! if (getArgumentCount() == 3) { arg = (Expression) arguments.get(2); arg = new DynamicCardinalityCheck(context, Cardinality.EXACTLY_ONE, arg, --- 121,125 ---- steps.add(arg); ! if (arguments.size() == 3) { arg = (Expression) arguments.get(2); arg = new DynamicCardinalityCheck(context, Cardinality.EXACTLY_ONE, arg, Index: FunReplace.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunReplace.java,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** FunReplace.java 13 Dec 2005 23:01:09 -0000 1.9 --- FunReplace.java 31 Dec 2005 17:32:50 -0000 1.10 *************** *** 23,32 **** --- 23,36 ---- package org.exist.xquery.functions; + import java.util.List; import java.util.regex.Pattern; import java.util.regex.PatternSyntaxException; import org.exist.dom.QName; + import org.exist.xquery.Atomize; import org.exist.xquery.Cardinality; import org.exist.xquery.Dependency; + import org.exist.xquery.DynamicCardinalityCheck; + import org.exist.xquery.Expression; import org.exist.xquery.Function; import org.exist.xquery.FunctionSignature; *************** *** 34,37 **** --- 38,42 ---- import org.exist.xquery.XPathException; import org.exist.xquery.XQueryContext; + import org.exist.xquery.util.Error; import org.exist.xquery.value.Item; import org.exist.xquery.value.Sequence; *************** *** 79,82 **** --- 84,122 ---- /* (non-Javadoc) + * @see org.exist.xquery.Function#setArguments(java.util.List) + */ + public void setArguments(List arguments) throws XPathException { + Expression arg = (Expression) arguments.get(0); + arg = new DynamicCardinalityCheck(context, Cardinality.ZERO_OR_ONE, arg, + new Error(Error.FUNC_PARAM_CARDINALITY, "1", mySignature)); + if(!Type.subTypeOf(arg.returnsType(), Type.ATOMIC)) + arg = new Atomize(context, arg); + steps.add(arg); + + arg = (Expression) arguments.get(1); + arg = new DynamicCardinalityCheck(context, Cardinality.EXACTLY_ONE, arg, + new Error(Error.FUNC_PARAM_CARDINALITY, "2", mySignature)); + if(!Type.subTypeOf(arg.returnsType(), Type.ATOMIC)) + arg = new Atomize(context, arg); + steps.add(arg); + + arg = (Expression) arguments.get(2); + arg = new DynamicCardinalityCheck(context, Cardinality.EXACTLY_ONE, arg, + new Error(Error.FUNC_PARAM_CARDINALITY, "3", mySignature)); + if(!Type.subTypeOf(arg.returnsType(), Type.ATOMIC)) + arg = new Atomize(context, arg); + steps.add(arg); + + if (arguments.size() == 4) { + arg = (Expression) arguments.get(3); + arg = new DynamicCardinalityCheck(context, Cardinality.EXACTLY_ONE, arg, + new Error(Error.FUNC_PARAM_CARDINALITY, "4", mySignature)); + if(!Type.subTypeOf(arg.returnsType(), Type.ATOMIC)) + arg = new Atomize(context, arg); + steps.add(arg); + } + } + + /* (non-Javadoc) * @see org.exist.xquery.Expression#eval(org.exist.dom.DocumentSet, 
org.exist.xquery.value.Sequence, org.exist.xquery.value.Item) */ *************** *** 90,94 **** context.getProfiler().message(this, Profiler.START_SEQUENCES, "CONTEXT ITEM", contextItem.toSequence()); } ! Sequence result; Sequence stringArg = getArgument(0).eval(contextSequence, contextItem); --- 130,134 ---- context.getProfiler().message(this, Profiler.START_SEQUENCES, "CONTEXT ITEM", contextItem.toSequence()); } ! Sequence result; Sequence stringArg = getArgument(0).eval(contextSequence, contextItem); |
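
The new setArguments() above only validates the optional fourth (flags) argument of fn:replace when it is actually supplied. For readers following along, here is a minimal, self-contained Java sketch of what such a four-argument call boils down to once the checks have passed, for example the replace($collection, '/[^/]*$', '', 'mx') call in the browse.xql change further up. This is an illustration, not eXist code: the class and helper names are invented, and the flag letters are mapped to the standard java.util.regex constants.

import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

public class ReplaceSketch {

    // Hypothetical helper: map XQuery flag letters to java.util.regex flag bits.
    static int toJavaFlags(String flags) {
        int f = 0;
        if (flags.indexOf('i') >= 0) f |= Pattern.CASE_INSENSITIVE;
        if (flags.indexOf('m') >= 0) f |= Pattern.MULTILINE;
        if (flags.indexOf('s') >= 0) f |= Pattern.DOTALL;
        if (flags.indexOf('x') >= 0) f |= Pattern.COMMENTS;
        return f;
    }

    // Roughly what fn:replace($input, $pattern, $replacement, $flags) does
    // after the cardinality and atomization checks added in this commit.
    static String replace(String input, String pattern, String replacement, String flags) {
        try {
            return Pattern.compile(pattern, toJavaFlags(flags))
                          .matcher(input)
                          .replaceAll(replacement);
        } catch (PatternSyntaxException e) {
            // The imports above suggest FunReplace reports this as an XPathException;
            // a plain IllegalArgumentException keeps this sketch self-contained.
            throw new IllegalArgumentException("Invalid regular expression: " + pattern, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(replace("/db/sandbox/browse.xql", "/[^/]*$", "", "mx"));
        // prints /db/sandbox : the last path component is stripped
    }
}

With no whitespace in the pattern, the 'x' (COMMENTS) flag has no effect here, and 'm' (MULTILINE) only changes how '$' is interpreted; with a single-line input both variants simply strip the trailing path segment.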

From: Pierrick B. <br...@us...> - 2005-12-31 13:02:53
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv26375/src/org/exist/storage Modified Files: NativeTextEngine.java Log Message: Refactored local variables Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.124 retrieving revision 1.125 diff -C2 -d -r1.124 -r1.125 *** NativeTextEngine.java 31 Dec 2005 09:07:32 -0000 1.124 --- NativeTextEngine.java 31 Dec 2005 13:02:41 -0000 1.125 *************** *** 160,167 **** public void storeAttribute(FulltextIndexSpec indexSpec, AttrImpl attr) { final DocumentImpl doc = (DocumentImpl) attr.getOwnerDocument(); ! final long gid = attr.getGID(); ! TextToken token; //TODO : case conversion should be handled by the tokenizer -pb ! tokenizer.setText(attr.getValue().toLowerCase()); while (null != (token = tokenizer.nextToken())) { if (token.length() > MAX_TOKEN_LENGTH) { --- 160,167 ---- public void storeAttribute(FulltextIndexSpec indexSpec, AttrImpl attr) { final DocumentImpl doc = (DocumentImpl) attr.getOwnerDocument(); ! final long gid = attr.getGID(); //TODO : case conversion should be handled by the tokenizer -pb ! tokenizer.setText(attr.getValue().toLowerCase()); ! TextToken token; while (null != (token = tokenizer.nextToken())) { if (token.length() > MAX_TOKEN_LENGTH) { *************** *** 316,347 **** public void dropIndex(DocumentImpl document) { //Collect document's tokens ! TreeSet tokens = new TreeSet(); ! NodeList children = document.getChildNodes(); ! NodeImpl node; for (int i = 0; i < children.getLength(); i++) { ! node = (NodeImpl) children.item(i); Iterator j = broker.getDOMIterator(new NodeProxy(document, node.getGID(), node.getInternalAddress())); collect(tokens, j); } - - String token; - WordRef ref; - int gidsCount; - byte section; - //TOUNDERSTAND -pb - int size; - VariableByteInput is; - int storedDocId; - boolean changed; short collectionId = document.getCollection().getId(); final Lock lock = dbTokens.getLock(); for (Iterator iter = tokens.iterator(); iter.hasNext();) { ! token = (String) iter.next(); ! ref = new WordRef(collectionId, token); try { lock.acquire(Lock.WRITE_LOCK); ! changed = false; os.clear(); ! is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is == null) --- 316,336 ---- public void dropIndex(DocumentImpl document) { //Collect document's tokens ! final TreeSet tokens = new TreeSet(); ! final NodeList children = document.getChildNodes(); for (int i = 0; i < children.getLength(); i++) { ! NodeImpl node = (NodeImpl) children.item(i); Iterator j = broker.getDOMIterator(new NodeProxy(document, node.getGID(), node.getInternalAddress())); collect(tokens, j); } short collectionId = document.getCollection().getId(); final Lock lock = dbTokens.getLock(); for (Iterator iter = tokens.iterator(); iter.hasNext();) { ! String token = (String) iter.next(); ! WordRef ref = new WordRef(collectionId, token); try { lock.acquire(Lock.WRITE_LOCK); ! boolean changed = false; os.clear(); ! VariableByteInput is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is == null) *************** *** 349,356 **** try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! section = is.readByte(); ! gidsCount = is.readInt(); ! 
size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: --- 338,346 ---- try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! byte section = is.readByte(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: *************** *** 435,476 **** else token = expr; ! ! NodeSet result = new ExtArrayNodeSet(docs.getLength(), 250); ! Value ref; ! int storedDocId; ! int storedSection; ! int freq; ! int gidsCount; ! long storedGID; ! long previousGID; ! long delta; ! //TOUNDERSTAND -pb ! int size; ! VariableByteInput is; ! Collection collection; ! short collectionId; ! DocumentImpl storedDocument; ! NodeProxy storedNode; ! NodeProxy parent; ! int sizeHint; ! Match match; for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { //Compute a key for the node ! collection = (Collection) iter.next(); ! collectionId = collection.getId(); ! ref = new WordRef(collectionId, token); Lock lock = dbTokens.getLock(); try { lock.acquire(); ! is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is == null) continue; while (is.available() > 0) { ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 425,448 ---- else token = expr; ! final NodeSet result = new ExtArrayNodeSet(docs.getLength(), 250); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { //Compute a key for the node ! Collection collection = (Collection) iter.next(); ! short collectionId = collection.getId(); ! Value ref = new WordRef(collectionId, token); Lock lock = dbTokens.getLock(); try { lock.acquire(); ! VariableByteInput is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is == null) continue; while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int storedSection = is.readByte(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 486,494 **** } //Process the nodes ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); switch (storedSection) { case ATTRIBUTE_SECTION : --- 458,467 ---- } //Process the nodes ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! int freq = is.readInt(); ! NodeProxy storedNode; switch (storedSection) { case ATTRIBUTE_SECTION : *************** *** 505,508 **** --- 478,482 ---- // in the context set. if (contextSet != null) { + NodeProxy parent; switch(storedSection) { case ATTRIBUTE_SECTION : *************** *** 516,523 **** } if (parent != null) { ! match = new Match(storedGID, token, freq); readOccurrences(freq, is, match, token.length()); parent.addMatch(match); ! sizeHint = contextSet.getSizeHint(storedDocument); result.add(parent, sizeHint); } else { --- 490,497 ---- } if (parent != null) { ! Match match = new Match(storedGID, token, freq); readOccurrences(freq, is, match, token.length()); parent.addMatch(match); ! 
int sizeHint = contextSet.getSizeHint(storedDocument); result.add(parent, sizeHint); } else { *************** *** 526,530 **** // otherwise, we add all text nodes without check } else { ! match = new Match(storedGID, token, freq); readOccurrences(freq, is, match, token.length()); storedNode.addMatch(match); --- 500,504 ---- // otherwise, we add all text nodes without check } else { ! Match match = new Match(storedGID, token, freq); readOccurrences(freq, is, match, token.length()); storedNode.addMatch(match); *************** *** 541,546 **** LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! is = null; //TODO : return ? } finally { --- 515,519 ---- LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error(e.getMessage(), e); //TODO : return ? } finally { *************** *** 563,567 **** // if the regexp starts with a char sequence, we restrict the index scan to entries starting with // the same sequence. Otherwise, we have to scan the whole index. ! StringBuffer token = new StringBuffer(); for (int i = 0; i < expr.length(); i++) { if (Character.isLetterOrDigit(expr.charAt(i))) --- 536,540 ---- // if the regexp starts with a char sequence, we restrict the index scan to entries starting with // the same sequence. Otherwise, we have to scan the whole index. ! final StringBuffer token = new StringBuffer(); for (int i = 0; i < expr.length(); i++) { if (Character.isLetterOrDigit(expr.charAt(i))) *************** *** 583,596 **** public NodeSet getNodes(XQueryContext context, DocumentSet docs, NodeSet contextSet, TermMatcher matcher, CharSequence startTerm) throws TerminatedException {; ! NodeSet result = new ExtArrayNodeSet(); ! final SearchCallback cb = new SearchCallback(context, matcher, result, contextSet, docs); ! Value ref; ! Collection collection; ! short collectionId; final Lock lock = dbTokens.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { - collection = (Collection) iter.next(); - collectionId = collection.getId(); //Compute a key for the token if (startTerm != null && startTerm.length() > 0) //TODO : case conversion should be handled by the tokenizer -pb --- 556,567 ---- public NodeSet getNodes(XQueryContext context, DocumentSet docs, NodeSet contextSet, TermMatcher matcher, CharSequence startTerm) throws TerminatedException {; ! final NodeSet result = new ExtArrayNodeSet(); ! final SearchCallback cb = new SearchCallback(context, matcher, result, contextSet, docs); final Lock lock = dbTokens.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { //Compute a key for the token + Collection collection = (Collection) iter.next(); + short collectionId = collection.getId(); + Value ref; if (startTerm != null && startTerm.length() > 0) //TODO : case conversion should be handled by the tokenizer -pb *************** *** 618,633 **** public String[] getIndexTerms(DocumentSet docs, TermMatcher matcher) { ! final IndexCallback cb = new IndexCallback(null, matcher); ! Value ref; ! IndexQuery query; ! Collection collection; ! short collectionId; final Lock lock = dbTokens.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { - collection = (Collection) iter.next(); - collectionId = collection.getId(); //Compute a key for the token ! ref = new WordRef(collectionId); ! 
query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); try { lock.acquire(); --- 589,600 ---- public String[] getIndexTerms(DocumentSet docs, TermMatcher matcher) { ! final IndexCallback cb = new IndexCallback(null, matcher); final Lock lock = dbTokens.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { //Compute a key for the token ! Collection collection = (Collection) iter.next(); ! short collectionId = collection.getId(); ! Value ref = new WordRef(collectionId); ! IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); try { lock.acquire(); *************** *** 651,664 **** throws PermissionDeniedException { final IndexScanCallback cb = new IndexScanCallback(docs, contextSet); - Value startRef; - Value endRef; - IndexQuery query; - Collection collection; - short collectionId; final Lock lock = dbTokens.getLock(); for (Iterator i = docs.getCollectionIterator(); i.hasNext();) { - collection = (Collection) i.next(); - collectionId = collection.getId(); //Compute a key for the token if (end == null) { startRef = new WordRef(collectionId, start.toLowerCase()); --- 618,629 ---- throws PermissionDeniedException { final IndexScanCallback cb = new IndexScanCallback(docs, contextSet); final Lock lock = dbTokens.getLock(); for (Iterator i = docs.getCollectionIterator(); i.hasNext();) { //Compute a key for the token + Collection collection = (Collection) i.next(); + short collectionId = collection.getId(); + Value startRef; + Value endRef; + IndexQuery query; if (end == null) { startRef = new WordRef(collectionId, start.toLowerCase()); *************** *** 713,718 **** byte[] data = ((Value) domIterator.next()).getData(); short type = Signatures.getType(data[0]); - String word; - TextToken token; switch (type) { case Node.ELEMENT_NODE : --- 678,681 ---- *************** *** 726,731 **** s = new String(data, 1, data.length - 1, "UTF-8"); tokenizer.setText(s); while (null != (token = tokenizer.nextToken())) { ! word = token.getText(); if (stoplist.contains(word)) continue; --- 689,695 ---- s = new String(data, 1, data.length - 1, "UTF-8"); tokenizer.setText(s); + TextToken token; while (null != (token = tokenizer.nextToken())) { ! String word = token.getText(); if (stoplist.contains(word)) continue; *************** *** 747,752 **** "UTF-8"); tokenizer.setText(val); while (null != (token = tokenizer.nextToken())) { ! word = token.getText().toString(); if (stoplist.contains(word)) continue; --- 711,717 ---- "UTF-8"); tokenizer.setText(val); + TextToken token; while (null != (token = tokenizer.nextToken())) { ! String word = token.getText().toString(); if (stoplist.contains(word)) continue; *************** *** 844,866 **** if (wordsCount == 0) return; - final ProgressIndicator progress = new ProgressIndicator(wordsCount, 100); final short collectionId = this.doc.getCollection().getId(); - OccurrenceList occurences; - int termCount; - long previousGID; - long delta; - //TOUNDERSTAND -pb - int lenOffset; - int freq; - Map.Entry entry; - String token; int count = 0; for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext(); count++) { ! entry = (Map.Entry) i.next(); ! token = (String) entry.getKey(); ! occurences = (OccurrenceList) entry.getValue(); ! 
termCount = occurences.getTermCount(); //Don't forget this one occurences.sort(); --- 809,821 ---- if (wordsCount == 0) return; final ProgressIndicator progress = new ProgressIndicator(wordsCount, 100); final short collectionId = this.doc.getCollection().getId(); int count = 0; for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext(); count++) { ! Map.Entry entry = (Map.Entry) i.next(); ! String token = (String) entry.getKey(); ! OccurrenceList occurences = (OccurrenceList) entry.getValue(); ! int termCount = occurences.getTermCount(); //Don't forget this one occurences.sort(); *************** *** 878,888 **** } os.writeInt(termCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; for (int j = 0; j < occurences.getSize(); ) { ! delta = occurences.nodes[j] - previousGID; os.writeLong(delta); ! freq = occurences.getOccurrences(j); os.writeInt(freq); for (int k = 0; k < freq; k++) { --- 833,844 ---- } os.writeInt(termCount); ! //TOUNDERSTAND -pb ! int lenOffset = os.position(); os.writeFixedInt(0); ! long previousGID = 0; for (int j = 0; j < occurences.getSize(); ) { ! long delta = occurences.nodes[j] - previousGID; os.writeLong(delta); ! int freq = occurences.getOccurrences(j); os.writeInt(freq); for (int k = 0; k < freq; k++) { *************** *** 938,959 **** //Return early if (doc == null) ! return; ! OccurrenceList storedOccurencesList; ! OccurrenceList newOccurencesList; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! Map.Entry entry; ! String token; ! WordRef ref; ! Value value; ! VariableByteArrayInput is; ! //TOUNDERSTAND -pb ! int size; ! int lenOffset; ! int storedDocId; ! byte storedSection; ! int freq; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbTokens.getLock(); --- 894,898 ---- //Return early if (doc == null) ! return; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbTokens.getLock(); *************** *** 961,983 **** for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { //Compute a key for the token ! entry = (Map.Entry) i.next(); ! storedOccurencesList = (OccurrenceList) entry.getValue(); ! token = (String) entry.getKey(); ! ref = new WordRef(collectionId, token); ! newOccurencesList = new OccurrenceList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! value = dbTokens.get(ref); //Does the token already has data in the index ? if (value != null) { //Add its data to the new list ! is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != this.doc.getDocId()) { // data are related to another section or document: --- 900,923 ---- for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { //Compute a key for the token ! Map.Entry entry = (Map.Entry) i.next(); ! OccurrenceList storedOccurencesList = (OccurrenceList) entry.getValue(); ! String token = (String) entry.getKey(); ! WordRef ref = new WordRef(collectionId, token); ! OccurrenceList newOccurencesList = new OccurrenceList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! Value value = dbTokens.get(ref); //Does the token already has data in the index ? if (value != null) { //Add its data to the new list ! 
VariableByteArrayInput is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! byte storedSection = is.readByte(); ! int termCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != this.doc.getDocId()) { // data are related to another section or document: *************** *** 991,999 **** // data are related to our section and document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); // add the node to the new list if it is not // in the list of removed nodes --- 931,939 ---- // data are related to our section and document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < termCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! int freq = is.readInt(); // add the node to the new list if it is not // in the list of removed nodes *************** *** 1015,1019 **** //append the data from the new list if(newOccurencesList.getSize() > 0) { ! termCount = newOccurencesList.getTermCount(); //Don't forget this one newOccurencesList.sort(); --- 955,959 ---- //append the data from the new list if(newOccurencesList.getSize() > 0) { ! int termCount = newOccurencesList.getTermCount(); //Don't forget this one newOccurencesList.sort(); *************** *** 1030,1040 **** } os.writeInt(termCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; for (int m = 0; m < newOccurencesList.getSize(); ) { ! delta = newOccurencesList.nodes[m] - previousGID; os.writeLong(delta); ! freq = newOccurencesList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { --- 970,981 ---- } os.writeInt(termCount); ! //TOUNDERSTAND -pb ! int lenOffset = os.position(); os.writeFixedInt(0); ! long previousGID = 0; for (int m = 0; m < newOccurencesList.getSize(); ) { ! long delta = newOccurencesList.nodes[m] - previousGID; os.writeLong(delta); ! int freq = newOccurencesList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { *************** *** 1075,1107 **** } ! public void reindex(DocumentImpl document, NodeImpl node) { ! OccurrenceList storedOccurencesList; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! Map.Entry entry; ! String token; ! WordRef ref; ! VariableByteInput is; ! //TOUNDERSTAND -pb ! int size; ! int lenOffset; ! int storedDocId; ! byte storedSection; ! int freq; final short collectionId = document.getCollection().getId(); final Lock lock = dbTokens.getLock(); for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { ! for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { //Compute a key for the token ! entry = (Map.Entry) i.next(); ! token = (String) entry.getKey(); ! storedOccurencesList = (OccurrenceList) entry.getValue(); ! ref = new WordRef(collectionId, token); ! os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is != null) { --- 1016,1033 ---- } ! public void reindex(DocumentImpl document, NodeImpl node) { final short collectionId = document.getCollection().getId(); final Lock lock = dbTokens.getLock(); for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { ! 
for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { //Compute a key for the token ! Map.Entry entry = (Map.Entry) i.next(); ! String token = (String) entry.getKey(); ! WordRef ref = new WordRef(collectionId, token); ! OccurrenceList storedOccurencesList = (OccurrenceList) entry.getValue(); ! os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! VariableByteInput is = dbTokens.getAsStream(ref); //Does the token already has data in the index ? if (is != null) { *************** *** 1109,1116 **** try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != document.getDocId()) { // data are related to another section or document: --- 1035,1043 ---- try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! byte storedSection = is.readByte(); ! int termCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != document.getDocId()) { // data are related to another section or document: *************** *** 1124,1132 **** // data are related to our section and document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) { --- 1051,1059 ---- // data are related to our section and document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < termCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! int freq = is.readInt(); if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) { *************** *** 1160,1164 **** if (storedOccurencesList.getSize() > 0) { //append the data from the new list ! termCount = storedOccurencesList.getTermCount(); storedOccurencesList.sort(); os.writeInt(document.getDocId()); --- 1087,1091 ---- if (storedOccurencesList.getSize() > 0) { //append the data from the new list ! int termCount = storedOccurencesList.getTermCount(); storedOccurencesList.sort(); os.writeInt(document.getDocId()); *************** *** 1174,1184 **** } os.writeInt(termCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; for (int m = 0; m < storedOccurencesList.getSize(); ) { ! delta = storedOccurencesList.nodes[m] - previousGID; os.writeLong(delta); ! freq = storedOccurencesList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { --- 1101,1112 ---- } os.writeInt(termCount); ! //TOUNDERSTAND -pb ! int lenOffset = os.position(); os.writeFixedInt(0); ! long previousGID = 0; for (int m = 0; m < storedOccurencesList.getSize(); ) { ! long delta = storedOccurencesList.nodes[m] - previousGID; os.writeLong(delta); ! int freq = storedOccurencesList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { *************** *** 1202,1212 **** } } catch (LockException e) { ! LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); ! is = null; } catch (ReadOnlyException e) { LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error("io error while reindexing word '" + token + "'"); ! is = null; } finally { lock.release(Lock.WRITE_LOCK); --- 1130,1138 ---- } } catch (LockException e) { ! 
LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (ReadOnlyException e) { LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error("io error while reindexing word '" + token + "'"); } finally { lock.release(Lock.WRITE_LOCK); *************** *** 1241,1245 **** context.proceed(); try { ! String word = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); if (matcher.matches(word)) matches.add(word); --- 1167,1171 ---- context.proceed(); try { ! final String word = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); if (matcher.matches(word)) matches.add(word); *************** *** 1278,1308 **** LOG.error(e.getMessage(), e); return true; ! } ! word.reuse(); word = UTF8.decode(key.getData(), 2, key.getLength() - 2, word); ! if (matcher.matches(word)) { ! int storedDocId; ! byte storedSection; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! int size; ! DocumentImpl storedDocument; ! NodeProxy storedNode; ! NodeProxy parentNode; ! Match match; ! int freq; ! int sizeHint; try { while (is.available() > 0) { if(context != null) context.proceed(); ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 1204,1221 ---- LOG.error(e.getMessage(), e); return true; ! } word.reuse(); word = UTF8.decode(key.getData(), 2, key.getLength() - 2, word); ! if (matcher.matches(word)) { try { while (is.available() > 0) { if(context != null) context.proceed(); ! int storedDocId = is.readInt(); ! byte storedSection = is.readByte(); ! int termCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 1310,1318 **** continue; } ! previousGID = 0; for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); switch (storedSection) { case TEXT_SECTION : --- 1223,1232 ---- continue; } ! long previousGID = 0; for (int j = 0; j < termCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! int freq = is.readInt(); ! NodeProxy storedNode; switch (storedSection) { case TEXT_SECTION : *************** *** 1326,1329 **** --- 1240,1244 ---- } if (contextSet != null) { + NodeProxy parentNode; switch (storedSection) { case TEXT_SECTION : *************** *** 1337,1349 **** } if (parentNode != null) { ! match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); parentNode.addMatch(match); ! sizeHint = contextSet.getSizeHint(storedDocument); result.add(parentNode, sizeHint); } else is.skip(freq); } else { ! match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); storedNode.addMatch(match); --- 1252,1264 ---- } if (parentNode != null) { ! Match match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); parentNode.addMatch(match); ! int sizeHint = contextSet.getSizeHint(storedDocument); result.add(parentNode, sizeHint); } else is.skip(freq); } else { ! 
Match match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); storedNode.addMatch(match); *************** *** 1400,1425 **** LOG.error(e.getMessage(), e); return true; ! } ! ! int storedDocId; ! byte storedSection; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! int size; ! DocumentImpl storedDocument; ! NodeProxy parentNode; ! int freq; ! boolean include; ! boolean docAdded; try { while (is.available() > 0) { ! docAdded = false; ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 1315,1329 ---- LOG.error(e.getMessage(), e); return true; ! } ! try { while (is.available() > 0) { ! boolean docAdded = false; ! int storedDocId = is.readInt(); ! byte storedSection = is.readByte(); ! int termCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 1427,1439 **** continue; } ! previousGID = 0; for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); is.skip(freq); if (contextSet != null) { ! include = false; ! parentNode = contextSet.parentWithChild(storedDocument, storedGID, false, true); switch (storedSection) { case TEXT_SECTION : --- 1331,1344 ---- continue; } ! long previousGID = 0; for (int j = 0; j < termCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! int freq = is.readInt(); ! //TODO : use variable is.skip(freq); if (contextSet != null) { ! boolean include = false; ! NodeProxy parentNode = contextSet.parentWithChild(storedDocument, storedGID, false, true); switch (storedSection) { case TEXT_SECTION : |
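The read loops in this hunk all decode the same on-disk layout: per document, a token entry stores a doc id, a section byte, a term count and a fixed-size length field, followed by a gap-encoded list of node GIDs, each with its frequency. Below is a minimal self-contained sketch of that gap encoding and decoding; it uses plain java.io streams instead of eXist's VariableByteInput/Output and leaves out the length field and the per-occurrence offsets, so the class and method names are illustrative assumptions rather than the project's API.

    import java.io.*;
    import java.util.*;

    // Illustrative sketch only: eXist writes these fields with variable-byte coding
    // and also stores a fixed-size length field plus per-occurrence offsets, all of
    // which are omitted here so the example stays self-contained.
    public class GapEncodedPostingsSketch {

        // Serialize one document's sorted GID -> frequency map for a token.
        static byte[] encode(int docId, byte section, SortedMap<Long, Integer> occurrences) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream os = new DataOutputStream(bos);
            os.writeInt(docId);
            os.writeByte(section);
            os.writeInt(occurrences.size());                 // termCount
            long previousGID = 0;
            for (Map.Entry<Long, Integer> e : occurrences.entrySet()) {
                os.writeLong(e.getKey() - previousGID);      // store the gap, not the absolute GID
                os.writeInt(e.getValue());                   // frequency
                previousGID = e.getKey();
            }
            return bos.toByteArray();
        }

        // Decode it again, mirroring the read loops in the hunk above.
        static SortedMap<Long, Integer> decode(byte[] data) throws IOException {
            DataInputStream is = new DataInputStream(new ByteArrayInputStream(data));
            int storedDocId = is.readInt();                  // checked against the current doc in eXist
            byte storedSection = is.readByte();              // TEXT_SECTION or ATTRIBUTE_SECTION there
            int termCount = is.readInt();
            SortedMap<Long, Integer> result = new TreeMap<>();
            long previousGID = 0;
            for (int j = 0; j < termCount; j++) {
                long storedGID = previousGID + is.readLong();
                int freq = is.readInt();
                result.put(storedGID, freq);
                previousGID = storedGID;
            }
            return result;
        }

        public static void main(String[] args) throws IOException {
            SortedMap<Long, Integer> occ = new TreeMap<>(Map.of(100L, 2, 130L, 1, 131L, 5));
            System.out.println(decode(encode(42, (byte) 0, occ)));   // {100=2, 130=1, 131=5}
        }
    }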
|
From: Pierrick B. <br...@us...> - 2005-12-31 12:19:55
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv14459/src/org/exist/storage Modified Files: NativeElementIndex.java Log Message: Refactored local variables Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.75 retrieving revision 1.76 diff -C2 -d -r1.75 -r1.76 *** NativeElementIndex.java 31 Dec 2005 09:07:32 -0000 1.75 --- NativeElementIndex.java 31 Dec 2005 12:19:47 -0000 1.76 *************** *** 125,129 **** */ public void sync() { ! Lock lock = dbNodes.getLock(); try { lock.acquire(Lock.WRITE_LOCK); --- 125,129 ---- */ public void sync() { ! final Lock lock = dbNodes.getLock(); try { lock.acquire(Lock.WRITE_LOCK); *************** *** 149,172 **** return; final ProgressIndicator progress = new ProgressIndicator(pending.size(), 5); ! QName qname; ! NodeProxy storedNode; ! //TODO : NativeValueIndex uses LongLinkedLists -pb ! ArrayList gids; ! int gidsCount; ! long previousGID; ! long delta; ! //TOUNDERSTAND -pb ! int lenOffset; ! final SymbolTable symbols = broker.getSymbols(); ! Map.Entry entry; ! ElementValue ref; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbNodes.getLock(); int count = 0; for (Iterator i = pending.entrySet().iterator(); i.hasNext(); count++) { ! entry = (Map.Entry) i.next(); ! qname = (QName) entry.getKey(); ! gids = (ArrayList) entry.getValue(); ! gidsCount = gids.size(); //Don't forget this one FastQSort.sort(gids, 0, gidsCount - 1); --- 149,162 ---- return; final ProgressIndicator progress = new ProgressIndicator(pending.size(), 5); ! final SymbolTable symbols = broker.getSymbols(); final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbNodes.getLock(); int count = 0; for (Iterator i = pending.entrySet().iterator(); i.hasNext(); count++) { ! Map.Entry entry = (Map.Entry) i.next(); ! QName qname = (QName) entry.getKey(); ! //TODO : NativeValueIndex uses LongLinkedLists -pb ! ArrayList gids = (ArrayList) entry.getValue(); ! int gidsCount = gids.size(); //Don't forget this one FastQSort.sort(gids, 0, gidsCount - 1); *************** *** 174,184 **** os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! lenOffset = os.position(); os.writeFixedInt(0); //Compute the GIDs list ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! storedNode = (NodeProxy) gids.get(j); ! delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); --- 164,175 ---- os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! //TOUNDERSTAND -pb ! int lenOffset = os.position(); os.writeFixedInt(0); //Compute the GIDs list ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! NodeProxy storedNode = (NodeProxy) gids.get(j); ! 
long delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); *************** *** 187,190 **** --- 178,182 ---- os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); //Compute a key for the node + ElementValue ref; if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { ref = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); *************** *** 229,255 **** if (pending.size() == 0) return; - QName qname; - NodeProxy storedNode; - List storedGIDList; - List newGIDList; - int gidsCount; - long storedGID; - long previousGID; - long delta; - Map.Entry entry; - Value searchKey; - Value value; - VariableByteArrayInput is; - //TOUNDERSTAND -pb - int size; - int lenOffset; - int storedDocId; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbNodes.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! entry = (Map.Entry) i.next(); ! storedGIDList = (ArrayList) entry.getValue(); ! qname = (QName) entry.getKey(); //Compute a key for the node if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { searchKey = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); --- 221,232 ---- if (pending.size() == 0) return; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbNodes.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! Map.Entry entry = (Map.Entry) i.next(); ! List storedGIDList = (ArrayList) entry.getValue(); ! QName qname = (QName) entry.getKey(); //Compute a key for the node + Value searchKey; if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { searchKey = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); *************** *** 259,276 **** searchKey = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); } ! newGIDList = new ArrayList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! value = dbNodes.get(searchKey); //Does the node already exist in the index ? if (value != null) { //Add its data to the new list ! is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); if (storedDocId != this.doc.getDocId()) { // data are related to another document: --- 236,254 ---- searchKey = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); } ! List newGIDList = new ArrayList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! Value value = dbNodes.get(searchKey); //Does the node already exist in the index ? if (value != null) { //Add its data to the new list ! VariableByteArrayInput is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedDocId != this.doc.getDocId()) { // data are related to another document: *************** *** 288,295 **** // data are related to our document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; long address = StorageAddress.read(is); // add the node to the new list if it is not --- 266,273 ---- // data are related to our document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! 
long storedGID = previousGID + delta; long address = StorageAddress.read(is); // add the node to the new list if it is not *************** *** 308,322 **** //append the data from the new list if (newGIDList.size() > 0 ) { ! gidsCount = newGIDList.size(); //Don't forget this one FastQSort.sort(newGIDList, 0, gidsCount - 1); os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! storedNode = (NodeProxy) newGIDList.get(j); ! delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); --- 286,301 ---- //append the data from the new list if (newGIDList.size() > 0 ) { ! int gidsCount = newGIDList.size(); //Don't forget this one FastQSort.sort(newGIDList, 0, gidsCount - 1); os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! //TOUNDERSTAND -pb ! int lenOffset = os.position(); os.writeFixedInt(0); ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! NodeProxy storedNode = (NodeProxy) newGIDList.get(j); ! long delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); *************** *** 375,385 **** */ //TODO : note that this is *not* this.doc -pb ! public void dropIndex(DocumentImpl document) throws ReadOnlyException { ! Value key; ! int gidsCount; ! int size; ! VariableByteInput is; ! int storedDocId; ! boolean changed; final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); --- 354,358 ---- */ //TODO : note that this is *not* this.doc -pb ! public void dropIndex(DocumentImpl document) throws ReadOnlyException { final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); *************** *** 390,402 **** ArrayList elements = dbNodes.findKeys(query); for (int i = 0; i < elements.size(); i++) { ! changed = false; ! key = (Value) elements.get(i); ! is = dbNodes.getAsStream(key); os.clear(); try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: --- 363,376 ---- ArrayList elements = dbNodes.findKeys(query); for (int i = 0; i < elements.size(); i++) { ! boolean changed = false; ! Value key = (Value) elements.get(i); ! VariableByteInput is = dbNodes.getAsStream(key); os.clear(); try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: *************** *** 446,467 **** if (pending.size() == 0) return; ! QName qname; ! NodeProxy storedNode; ! //TODO : NativeValueIndex uses LongLinkedLists -pb ! List newGIDList; ! List storedGIDList; ! int gidsCount; ! long storedGID; ! long previousGID; ! long delta; ! Value ref; ! Map.Entry entry; final SymbolTable symbols = broker.getSymbols(); - VariableByteInput is; - //TOUNDERSTAND -pb - int size; - int lenOffset; - int storedDocId; - long address; final short collectionId = document.getCollection().getId(); final Lock lock = dbNodes.getLock(); --- 420,434 ---- if (pending.size() == 0) return; ! ! ! ! ! ! ! ! ! ! 
final SymbolTable symbols = broker.getSymbols(); final short collectionId = document.getCollection().getId(); final Lock lock = dbNodes.getLock(); *************** *** 469,476 **** try { lock.acquire(Lock.WRITE_LOCK); ! entry = (Map.Entry) i.next(); ! storedGIDList = (ArrayList) entry.getValue(); ! qname = (QName) entry.getKey(); //Compute a key for the node if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { ref = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); --- 436,448 ---- try { lock.acquire(Lock.WRITE_LOCK); ! os.clear(); ! //TODO : NativeValueIndex uses LongLinkedLists -pb ! List newGIDList = new ArrayList(); //Compute a key for the node + Map.Entry entry = (Map.Entry) i.next(); + //TODO : NativeValueIndex uses LongLinkedLists -pb + List storedGIDList = (ArrayList) entry.getValue(); + QName qname = (QName) entry.getKey(); + Value ref; if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { ref = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); *************** *** 480,486 **** ref = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); } ! is = dbNodes.getAsStream(ref); ! os.clear(); ! newGIDList = new ArrayList(); //Does the node already exist in the index ? if (is != null) { --- 452,456 ---- ref = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); } ! VariableByteInput is = dbNodes.getAsStream(ref); //Does the node already exist in the index ? if (is != null) { *************** *** 488,494 **** try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: --- 458,465 ---- try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); if (storedDocId != document.getDocId()) { // data are related to another document: *************** *** 501,509 **** // data are related to our document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! address = StorageAddress.read(is); if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) { --- 472,480 ---- // data are related to our document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! long address = StorageAddress.read(is); if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) { *************** *** 526,540 **** // append the new list to any existing data if (node != null) storedGIDList.addAll(newGIDList); ! gidsCount = storedGIDList.size(); //Don't forget this one FastQSort.sort(storedGIDList, 0, gidsCount - 1); os.writeInt(document.getDocId()); os.writeInt(gidsCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! storedNode = (NodeProxy) storedGIDList.get(j); ! delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); --- 497,512 ---- // append the new list to any existing data if (node != null) storedGIDList.addAll(newGIDList); ! int gidsCount = storedGIDList.size(); //Don't forget this one FastQSort.sort(storedGIDList, 0, gidsCount - 1); os.writeInt(document.getDocId()); os.writeInt(gidsCount); ! //TOUNDERSTAND -pb ! 
int lenOffset = os.position(); os.writeFixedInt(0); ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! NodeProxy storedNode = (NodeProxy) storedGIDList.get(j); ! long delta = storedNode.getGID() - previousGID; os.writeLong(delta); StorageAddress.write(storedNode.getInternalAddress(), os); *************** *** 549,553 **** } } else { ! address = ((BFile.PageInputStream) is).getAddress(); if (dbNodes.update(address, ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not update index data for node '" + qname + "'"); --- 521,525 ---- } } else { ! long address = ((BFile.PageInputStream) is).getAddress(); if (dbNodes.update(address, ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not update index data for node '" + qname + "'"); *************** *** 559,564 **** return; } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! is = null; //TODO : return ? } catch (ReadOnlyException e) { --- 531,535 ---- return; } catch (IOException e) { ! LOG.error(e.getMessage(), e); //TODO : return ? } catch (ReadOnlyException e) { *************** *** 603,624 **** final ExtArrayNodeSet result = new ExtArrayNodeSet(docs.getLength(), 256); final SymbolTable symbols = broker.getSymbols(); - ElementValue ref; - int storedDocId; - int gidsCount; - //TOUNDERSTAND -pb - int size; - long storedGID; - long previousGID; - long delta; - NodeProxy storedNode; - VariableByteInput is; - Collection collection; - short collectionId; - DocumentImpl storedDocument; final Lock lock = dbNodes.getLock(); for (Iterator i = docs.getCollectionIterator(); i.hasNext();) { - collection = (Collection) i.next(); - collectionId = collection.getId(); //Compute a key for the node if (type == ElementValue.ATTRIBUTE_ID) { ref = new ElementValue((byte) type, collectionId, qname.getLocalName()); --- 574,583 ---- final ExtArrayNodeSet result = new ExtArrayNodeSet(docs.getLength(), 256); final SymbolTable symbols = broker.getSymbols(); final Lock lock = dbNodes.getLock(); for (Iterator i = docs.getCollectionIterator(); i.hasNext();) { //Compute a key for the node + Collection collection = (Collection) i.next(); + short collectionId = collection.getId(); + ElementValue ref; if (type == ElementValue.ATTRIBUTE_ID) { ref = new ElementValue((byte) type, collectionId, qname.getLocalName()); *************** *** 630,643 **** try { lock.acquire(Lock.READ_LOCK); ! is = dbNodes.getAsStream(ref); //Does the node already has data in the index ? if (is == null) ! continue; ! while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 589,602 ---- try { lock.acquire(Lock.READ_LOCK); ! VariableByteInput is = dbNodes.getAsStream(ref); //Does the node already has data in the index ? if (is == null) ! continue; while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 646,659 **** } //Process the nodes ! previousGID = 0; for (int k = 0; k < gidsCount; k++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; if (selector == null) { ! storedNode = new NodeProxy(storedDocument, storedGID, nodeType, StorageAddress.read(is)); result.add(storedNode, gidsCount); } else { //Filter out the node if requested to do so ! 
storedNode = selector.match(storedDocument, storedGID); if (storedNode == null) { is.skip(3); --- 605,618 ---- } //Process the nodes ! long previousGID = 0; for (int k = 0; k < gidsCount; k++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; if (selector == null) { ! NodeProxy storedNode = new NodeProxy(storedDocument, storedGID, nodeType, StorageAddress.read(is)); result.add(storedNode, gidsCount); } else { //Filter out the node if requested to do so ! NodeProxy storedNode = selector.match(storedDocument, storedGID); if (storedNode == null) { is.skip(3); *************** *** 672,677 **** LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! is = null; //TODO : return ? } finally { --- 631,635 ---- LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); } catch (IOException e) { ! LOG.error(e.getMessage(), e); //TODO : return ? } finally { *************** *** 692,703 **** else collections = new ArrayList(); ! collections.add(collection); ! QName qname; ! int gidsCount; ! //TOUNDERSTAND -pb ! int size; final SymbolTable symbols = broker.getSymbols(); ! VariableByteArrayInput is; ! TreeMap map = new TreeMap(); final Lock lock = dbNodes.getLock(); for (Iterator i = collections.iterator(); i.hasNext();) { --- 650,656 ---- else collections = new ArrayList(); ! collections.add(collection); final SymbolTable symbols = broker.getSymbols(); ! final TreeMap map = new TreeMap(); final Lock lock = dbNodes.getLock(); for (Iterator i = collections.iterator(); i.hasNext();) { *************** *** 722,726 **** namespace = symbols.getNamespace(nsSymbol); } ! qname = new QName(name, namespace); Occurrences oc = (Occurrences) map.get(qname); if (oc == null) { --- 675,679 ---- namespace = symbols.getNamespace(nsSymbol); } ! QName qname = new QName(name, namespace); Occurrences oc = (Occurrences) map.get(qname); if (oc == null) { *************** *** 731,740 **** map.put(qname, oc); } ! is = new VariableByteArrayInput(val[1].data(), val[1].start(), val[1].getLength()); try { while (is.available() > 0) { is.readInt(); ! gidsCount = is.readInt(); ! size = is.readFixedInt(); is.skipBytes(size); oc.addOccurrences(gidsCount); --- 684,694 ---- map.put(qname, oc); } ! VariableByteArrayInput is = new VariableByteArrayInput(val[1].data(), val[1].start(), val[1].getLength()); try { while (is.available() > 0) { is.readInt(); ! int gidsCount = is.readInt(); ! //TOUNDERSTAND -pb ! int size = is.readFixedInt(); is.skipBytes(size); oc.addOccurrences(gidsCount); *************** *** 765,781 **** //TODO : note that this is *not* this.doc -pb public void consistencyCheck(DocumentImpl document) throws EXistException { ! final SymbolTable symbols = broker.getSymbols(); ! Node storedNode; ! int storedDocId; ! int gidsCount; ! long storedGID; ! long previousGID; ! long delta; ! long address; ! String nodeName; ! StringBuffer msg = new StringBuffer(); final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); final IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); final Lock lock = dbNodes.getLock(); try { --- 719,727 ---- //TODO : note that this is *not* this.doc -pb public void consistencyCheck(DocumentImpl document) throws EXistException { ! 
final SymbolTable symbols = broker.getSymbols(); final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); final IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); + final StringBuffer msg = new StringBuffer(); final Lock lock = dbNodes.getLock(); try { *************** *** 787,791 **** Value value = dbNodes.get(key); short sym = ByteConversion.byteToShort(key.data(), key.start() + 3); ! nodeName = symbols.getName(sym); msg.setLength(0); msg.append("Checking ").append(nodeName).append(": "); --- 733,737 ---- Value value = dbNodes.get(key); short sym = ByteConversion.byteToShort(key.data(), key.start() + 3); ! String nodeName = symbols.getName(sym); msg.setLength(0); msg.append("Checking ").append(nodeName).append(": "); *************** *** 793,798 **** try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); is.readFixedInt(); if (storedDocId != document.getDocId()) { --- 739,745 ---- try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! //TODO : use variable -pb is.readFixedInt(); if (storedDocId != document.getDocId()) { *************** *** 803,812 **** // data are related to our document: // check ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! address = StorageAddress.read(is); ! storedNode = broker.objectWith(new NodeProxy(doc, storedGID, address)); if (storedNode == null) { throw new EXistException("Node " + storedGID + " in document " + document.getFileName() + " not found."); --- 750,759 ---- // data are related to our document: // check ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; ! long address = StorageAddress.read(is); ! Node storedNode = broker.objectWith(new NodeProxy(doc, storedGID, address)); if (storedNode == null) { throw new EXistException("Node " + storedGID + " in document " + document.getFileName() + " not found."); |
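A pattern repeated throughout this diff, and flagged by the TOUNDERSTAND comments, is the writeFixedInt(0) call: a four-byte placeholder for the payload length is written before the GID list, its offset remembered in lenOffset, and once the list has been serialized the placeholder is overwritten via writeFixedInt(lenOffset, ...) so that readers elsewhere in the file can skip a whole entry with skipBytes(size) instead of decoding it. A rough standalone illustration of that back-patching idea, using a plain ByteBuffer instead of eXist's VariableByteOutputStream (names and field sizes here are assumptions made for the sketch):

    import java.nio.ByteBuffer;

    // Sketch of the "write a placeholder, back-patch the length later" pattern
    // seen around the writeFixedInt(0) calls above.
    public class LengthBackPatchSketch {

        static ByteBuffer writeEntry(int docId, long[] sortedGids) {
            ByteBuffer buf = ByteBuffer.allocate(4 + 4 + 4 + sortedGids.length * 8);
            buf.putInt(docId);
            buf.putInt(sortedGids.length);                   // gidsCount
            int lenOffset = buf.position();                  // remember where the length field lives
            buf.putInt(0);                                   // placeholder, real value not yet known
            long previousGID = 0;
            for (long gid : sortedGids) {
                buf.putLong(gid - previousGID);              // gap-encode the sorted GIDs
                previousGID = gid;
            }
            // back-patch: payload size = bytes written after the 4-byte length field
            buf.putInt(lenOffset, buf.position() - lenOffset - 4);
            buf.flip();
            return buf;
        }

        public static void main(String[] args) {
            ByteBuffer entry = writeEntry(7, new long[]{10, 12, 40});
            entry.getInt();                                  // docId
            entry.getInt();                                  // gidsCount
            int size = entry.getInt();                       // the back-patched payload length
            System.out.println("payload bytes: " + size);    // 24, enough to skip the entry unread
        }
    }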
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-31 10:24:55
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv14458/src/org/exist/xquery/value

Modified Files:
	OrderedValueSequence.java
Log Message:
Optimize child:: axis selection: use a top-down approach if the context sequence has less than 50 items. For small sequences, traversing the node tree top-down is faster than the usual bottom-up approach used by eXist.

Index: OrderedValueSequence.java
===================================================================
RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/OrderedValueSequence.java,v
retrieving revision 1.7
retrieving revision 1.8
diff -C2 -d -r1.7 -r1.8
*** OrderedValueSequence.java	14 Nov 2004 22:15:13 -0000	1.7
--- OrderedValueSequence.java	31 Dec 2005 10:24:44 -0000	1.8
***************
*** 45,50 ****
  	private int count = 0;
  
- 	private long execTime = 0;
- 
  	public OrderedValueSequence(OrderSpec orderSpecs[], int size) {
  		this.orderSpecs = orderSpecs;
--- 45,48 ----
***************
*** 141,145 ****
  
  		public Entry(Item item) throws XPathException {
- 			long start = System.currentTimeMillis();
  			this.item = item;
  			values = new AtomicValue[orderSpecs.length];
--- 139,142 ----
***************
*** 154,158 ****
  						" ; found: " + seq.getLength());
  			}
- 			execTime = execTime + (System.currentTimeMillis() - start);
  		}
  
--- 151,154 ----
|
From: Pierrick B. <br...@us...> - 2005-12-31 10:01:09
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv10834/src/org/exist/xquery/value Modified Files: PreorderedValueSequence.java Log Message: Use of named constants Index: PreorderedValueSequence.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/PreorderedValueSequence.java,v retrieving revision 1.4 retrieving revision 1.5 diff -C2 -d -r1.4 -r1.5 *** PreorderedValueSequence.java 16 Nov 2004 15:47:06 -0000 1.4 --- PreorderedValueSequence.java 31 Dec 2005 10:01:00 -0000 1.5 *************** *** 29,32 **** --- 29,33 ---- import org.exist.dom.NodeProxy; import org.exist.dom.NodeSet; + import org.exist.xquery.Constants; import org.exist.xquery.Expression; import org.exist.xquery.OrderSpec; *************** *** 157,173 **** if(a == AtomicValue.EMPTY_VALUE && b != AtomicValue.EMPTY_VALUE) { if((orderSpecs[i].getModifiers() & OrderSpec.EMPTY_LEAST) != 0) ! cmp = -1; else ! cmp = 1; ! } else if(b == AtomicValue.EMPTY_VALUE && a != AtomicValue.EMPTY_VALUE) { if((orderSpecs[i].getModifiers() & OrderSpec.EMPTY_LEAST) != 0) ! cmp = 1; else ! cmp = -1; } else cmp = a.compareTo(orderSpecs[i].getCollator(), b); if((orderSpecs[i].getModifiers() & OrderSpec.DESCENDING_ORDER) != 0) cmp = cmp * -1; ! if(cmp != 0) break; } catch (XPathException e) { --- 158,174 ---- if(a == AtomicValue.EMPTY_VALUE && b != AtomicValue.EMPTY_VALUE) { if((orderSpecs[i].getModifiers() & OrderSpec.EMPTY_LEAST) != 0) ! cmp = Constants.INFERIOR; else ! cmp = Constants.SUPERIOR; ! } else if(a != AtomicValue.EMPTY_VALUE && b == AtomicValue.EMPTY_VALUE) { if((orderSpecs[i].getModifiers() & OrderSpec.EMPTY_LEAST) != 0) ! cmp = Constants.SUPERIOR; else ! cmp = Constants.INFERIOR; } else cmp = a.compareTo(orderSpecs[i].getCollator(), b); if((orderSpecs[i].getModifiers() & OrderSpec.DESCENDING_ORDER) != 0) cmp = cmp * -1; ! if(cmp != Constants.EQUAL) break; } catch (XPathException e) { |
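The hunk above swaps the bare -1, 0 and 1 literals in the order-by comparator for Constants.INFERIOR, Constants.EQUAL and Constants.SUPERIOR, while keeping the empty least/greatest and descending-order handling unchanged. A condensed sketch of that comparison logic follows; the constant names mirror the ones introduced by the patch, null stands in for AtomicValue.EMPTY_VALUE, and the modifier bits and method signature are invented for the example.

    // Sketch of an order-by comparator with "empty least"/"empty greatest" and
    // descending modifiers. INFERIOR/EQUAL/SUPERIOR mirror eXist's Constants;
    // null stands in for AtomicValue.EMPTY_VALUE; the modifier bits are invented.
    public class OrderSpecSketch {

        static final int INFERIOR = -1, EQUAL = 0, SUPERIOR = 1;
        static final int EMPTY_LEAST = 1, DESCENDING_ORDER = 2;

        static int compare(String a, String b, int modifiers) {
            int cmp;
            if (a == null && b != null)
                cmp = (modifiers & EMPTY_LEAST) != 0 ? INFERIOR : SUPERIOR;
            else if (a != null && b == null)
                cmp = (modifiers & EMPTY_LEAST) != 0 ? SUPERIOR : INFERIOR;
            else if (a == null)                              // both values are empty
                cmp = EQUAL;
            else
                cmp = Integer.signum(a.compareTo(b));
            if ((modifiers & DESCENDING_ORDER) != 0)         // invert for descending order
                cmp = -cmp;
            return cmp;
        }

        public static void main(String[] args) {
            System.out.println(compare(null, "x", EMPTY_LEAST));         // -1: empty sorts first
            System.out.println(compare(null, "x", 0));                   //  1: empty sorts last
            System.out.println(compare("a", "b", DESCENDING_ORDER));     //  1: reversed ordering
        }
    }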
|
From: Pierrick B. <br...@us...> - 2005-12-31 10:00:40
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv10801/src/org/exist/storage Modified Files: NativeValueIndex.java Log Message: Refactored local variables Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.58 retrieving revision 1.59 diff -C2 -d -r1.58 -r1.59 *** NativeValueIndex.java 31 Dec 2005 09:07:32 -0000 1.58 --- NativeValueIndex.java 31 Dec 2005 10:00:30 -0000 1.59 *************** *** 221,242 **** //TODO : return if doc == null? -pb if (pending.size() == 0) ! return; ! Indexable indexable; ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList gidList; ! long gids[]; ! int gidsCount; ! long previousGID; ! long delta; ! Value ref; ! Map.Entry entry; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! entry = (Map.Entry) i.next(); ! indexable = (Indexable) entry.getKey(); ! gidList = (LongLinkedList) entry.getValue(); ! gids = gidList.getData(); ! gidsCount = gids.length; //Don't forget this one Arrays.sort(gids); --- 221,234 ---- //TODO : return if doc == null? -pb if (pending.size() == 0) ! return; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! Map.Entry entry = (Map.Entry) i.next(); ! Indexable indexable = (Indexable) entry.getKey(); ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList gidList = (LongLinkedList) entry.getValue(); ! long[] gids = gidList.getData(); ! int gidsCount = gids.length; //Don't forget this one Arrays.sort(gids); *************** *** 245,256 **** os.writeInt(gidsCount); //Compute the GID list ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = gids[j] - previousGID; os.writeLong(delta); previousGID = gids[j]; } //Compute a key for the value ! ref = new Value(indexable.serialize(collectionId, caseSensitive)); try { lock.acquire(Lock.WRITE_LOCK); --- 237,248 ---- os.writeInt(gidsCount); //Compute the GID list ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = gids[j] - previousGID; os.writeLong(delta); previousGID = gids[j]; } //Compute a key for the value ! Value ref = new Value(indexable.serialize(collectionId, caseSensitive)); try { lock.acquire(Lock.WRITE_LOCK); *************** *** 282,321 **** //TODO : return if doc == null? -pb if (pending.size() == 0) ! return; ! Indexable indexable; ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList storedGIDList; ! LongLinkedList newGIDList; ! long[] gids; ! int gidsCount; ! long storedGID; ! long previousGID; ! long delta; ! Map.Entry entry; ! Value searchKey; ! Value value; ! VariableByteArrayInput is; ! int storedDocId; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! entry = (Map.Entry) i.next(); ! indexable = (Indexable) entry.getKey(); ! storedGIDList = (LongLinkedList) entry.getValue(); //Compute a key for the value ! searchKey = new Value(indexable.serialize(collectionId, caseSensitive)); ! newGIDList = new LongLinkedList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! value = dbValues.get(searchKey); //Does the value already exist in the index ? 
if (value != null) { //Add its data to the new list ! is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); if (storedDocId != this.doc.getDocId()) { // data are related to another document: --- 274,301 ---- //TODO : return if doc == null? -pb if (pending.size() == 0) ! return; final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { ! Map.Entry entry = (Map.Entry) i.next(); ! Indexable indexable = (Indexable) entry.getKey(); ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList storedGIDList = (LongLinkedList) entry.getValue(); //Compute a key for the value ! Value searchKey = new Value(indexable.serialize(collectionId, caseSensitive)); ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList newGIDList = new LongLinkedList(); os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! Value value = dbValues.get(searchKey); //Does the value already exist in the index ? if (value != null) { //Add its data to the new list ! VariableByteArrayInput is = new VariableByteArrayInput(value.getData()); try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); if (storedDocId != this.doc.getDocId()) { // data are related to another document: *************** *** 327,334 **** // data are related to our document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; // add the node to the new list if it is not // in the list of removed nodes --- 307,314 ---- // data are related to our document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; // add the node to the new list if it is not // in the list of removed nodes *************** *** 346,358 **** //append the data from the new list if (newGIDList.getSize() > 0) { ! gids = newGIDList.getData(); ! gidsCount = gids.length; //Don't forget this one Arrays.sort(gids); os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = gids[j] - previousGID; os.writeLong(delta); previousGID = gids[j]; --- 326,338 ---- //append the data from the new list if (newGIDList.getSize() > 0) { ! long[] gids = newGIDList.getData(); ! int gidsCount = gids.length; //Don't forget this one Arrays.sort(gids); os.writeInt(this.doc.getDocId()); os.writeInt(gidsCount); ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = gids[j] - previousGID; os.writeLong(delta); previousGID = gids[j]; *************** *** 388,394 **** */ public void dropIndex(Collection collection) { ! Value ref = new ElementValue(collection.getId()); ! IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); ! Lock lock = dbValues.getLock(); try { lock.acquire(Lock.WRITE_LOCK); --- 368,374 ---- */ public void dropIndex(Collection collection) { ! final Value ref = new ElementValue(collection.getId()); ! final IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); ! final Lock lock = dbValues.getLock(); try { lock.acquire(Lock.WRITE_LOCK); *************** *** 410,421 **** */ //TODO : note that this is *not* this.doc -pb ! public void dropIndex(DocumentImpl document) throws ReadOnlyException { ! Value key; ! Value value; ! int gidsCount; ! 
long delta; ! VariableByteArrayInput is; ! int storedDocId; ! boolean changed; final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); --- 390,394 ---- */ //TODO : note that this is *not* this.doc -pb ! public void dropIndex(DocumentImpl document) throws ReadOnlyException { final short collectionId = document.getCollection().getId(); final Value ref = new ElementValue(collectionId); *************** *** 426,437 **** ArrayList elements = dbValues.findKeys(query); for (int i = 0; i < elements.size(); i++) { ! changed = false; ! key = (Value) elements.get(i); ! value = dbValues.get(key); ! is = new VariableByteArrayInput(value.getData()); os.clear(); while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); if (storedDocId != document.getDocId()) { // data are related to another document: --- 399,410 ---- ArrayList elements = dbValues.findKeys(query); for (int i = 0; i < elements.size(); i++) { ! boolean changed = false; ! Value key = (Value) elements.get(i); ! Value value = dbValues.get(key); ! VariableByteArrayInput is = new VariableByteArrayInput(value.getData()); os.clear(); while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); if (storedDocId != document.getDocId()) { // data are related to another document: *************** *** 440,444 **** os.writeInt(gidsCount); for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); os.writeLong(delta); } --- 413,417 ---- os.writeInt(gidsCount); for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); os.writeLong(delta); } *************** *** 482,518 **** if (pending.size() == 0) return; - Indexable indexable; - //TODO : NativeElementIndex uses ArrayLists -pb - LongLinkedList storedGIDList; - LongLinkedList newGIDList; - long[] gids; - int gidsCount; - long storedGID; - long previousGID; - long delta; - Value ref; - Map.Entry entry; - VariableByteInput is; - int storedDocId; - long address; final short collectionId = document.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { //Compute a key for the value ! entry = (Map.Entry) i.next(); ! indexable = (Indexable) entry.getKey(); ! storedGIDList = (LongLinkedList) entry.getValue(); ! ref = new Value(indexable.serialize(collectionId, caseSensitive)); try { lock.acquire(Lock.WRITE_LOCK); ! is = dbValues.getAsStream(ref); os.clear(); ! newGIDList = new LongLinkedList(); //Does the value already has data in the index ? if (is != null) { try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); if (storedDocId != document.getDocId()) { // data are related to another document: --- 455,479 ---- if (pending.size() == 0) return; final short collectionId = document.getCollection().getId(); final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { //Compute a key for the value ! Map.Entry entry = (Map.Entry) i.next(); ! Indexable indexable = (Indexable) entry.getKey(); ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList storedGIDList = (LongLinkedList) entry.getValue(); ! Value ref = new Value(indexable.serialize(collectionId, caseSensitive)); try { lock.acquire(Lock.WRITE_LOCK); ! VariableByteInput is = dbValues.getAsStream(ref); os.clear(); ! //TODO : NativeElementIndex uses ArrayLists -pb ! LongLinkedList newGIDList = new LongLinkedList(); //Does the value already has data in the index ? 
if (is != null) { try { while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); if (storedDocId != document.getDocId()) { // data are related to another document: *************** *** 524,531 **** // data are related to our document: // feed the new list with the GIDs ! previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) --- 485,492 ---- // data are related to our document: // feed the new list with the GIDs ! long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; if (node == null) { if (document.getTreeLevel(storedGID) < document.reindexRequired()) *************** *** 545,560 **** } } ! // append the new list to any existing data ! gids = storedGIDList.getData(); ! gidsCount = gids.length; ! //Don't forget this one ! Arrays.sort(gids); ! os.writeInt(document.getDocId()); ! os.writeInt(gidsCount); ! previousGID = 0; ! for (int j = 0; j < gidsCount; j++) { ! delta = gids[j] - previousGID; ! os.writeLong(delta); ! previousGID = gids[j]; } //Store the data --- 506,523 ---- } } ! if (storedGIDList.getSize() > 0) { ! // append the new list to any existing data ! long[] gids = storedGIDList.getData(); ! int gidsCount = gids.length; ! //Don't forget this one ! Arrays.sort(gids); ! os.writeInt(document.getDocId()); ! os.writeInt(gidsCount); ! long previousGID = 0; ! for (int j = 0; j < gidsCount; j++) { ! long delta = gids[j] - previousGID; ! os.writeLong(delta); ! previousGID = gids[j]; ! } } //Store the data *************** *** 565,569 **** } } else { ! address = ((BFile.PageInputStream) is).getAddress(); if (dbValues.update(address, ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not update index data for value '" + ref + "'"); --- 528,532 ---- } } else { ! long address = ((BFile.PageInputStream) is).getAddress(); if (dbValues.update(address, ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not update index data for value '" + ref + "'"); *************** *** 575,580 **** return; } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! is = null; //TODO : return ? } catch (ReadOnlyException e) { --- 538,542 ---- return; } catch (IOException e) { ! LOG.error(e.getMessage(), e); //TODO : return ? } catch (ReadOnlyException e) { *************** *** 647,653 **** } ! TermMatcher comparator = new RegexMatcher(expr, type, flags); ! NodeSet result = new ExtArrayNodeSet(); ! RegexCallback callback = new RegexCallback(docs, contextSet, result, comparator); final Lock lock = dbValues.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { --- 609,615 ---- } ! final TermMatcher comparator = new RegexMatcher(expr, type, flags); ! final NodeSet result = new ExtArrayNodeSet(); ! final RegexCallback callback = new RegexCallback(docs, contextSet, result, comparator); final Lock lock = dbValues.getLock(); for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { *************** *** 823,840 **** LOG.error(e.getMessage(), e); return true; ! } ! ! int storedDocId; ! int gidsCount; ! long storedGID; ! long delta; ! DocumentImpl storedDocument; ! NodeProxy storedNode, parentNode; try { int sizeHint = -1; while (is.available() > 0) { ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! 
storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 785,795 ---- LOG.error(e.getMessage(), e); return true; ! } try { int sizeHint = -1; while (is.available() > 0) { ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 850,858 **** } //Process the nodes ! storedGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); storedGID = storedGID + delta; ! storedNode = new NodeProxy(storedDocument, storedGID); // if a context set is specified, we can directly check if the // matching node is a descendant of one of the nodes --- 805,813 ---- } //Process the nodes ! long storedGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); storedGID = storedGID + delta; ! NodeProxy storedNode = new NodeProxy(storedDocument, storedGID); // if a context set is specified, we can directly check if the // matching node is a descendant of one of the nodes *************** *** 861,865 **** sizeHint = contextSet.getSizeHint(storedDocument); if (returnAncestor) { ! parentNode = contextSet.parentWithChild(storedNode, false, true, NodeProxy.UNKNOWN_NODE_LEVEL); if (parentNode != null) result.add(parentNode, sizeHint); --- 816,820 ---- sizeHint = contextSet.getSizeHint(storedDocument); if (returnAncestor) { ! NodeProxy parentNode = contextSet.parentWithChild(storedNode, false, true, NodeProxy.UNKNOWN_NODE_LEVEL); if (parentNode != null) result.add(parentNode, sizeHint); *************** *** 921,926 **** * @see org.dbxml.core.filer.BTreeCallback#indexInfo(org.dbxml.core.data.Value, long) */ ! public boolean indexInfo(Value key, long pointer) throws TerminatedException { ! AtomicValue atomic; try { --- 876,880 ---- * @see org.dbxml.core.filer.BTreeCallback#indexInfo(org.dbxml.core.data.Value, long) */ ! public boolean indexInfo(Value key, long pointer) throws TerminatedException { AtomicValue atomic; try { *************** *** 939,957 **** LOG.error(e.getMessage(), e); return true; ! } ! int storedDocId; ! int gidsCount; ! long storedGID; ! long delta; ! DocumentImpl storedDocument; ! boolean docAdded; ! ValueOccurrences oc = (ValueOccurrences) map.get(atomic); try { while (is.available() > 0) { ! docAdded = false; ! storedDocId = is.readInt(); ! gidsCount = is.readInt(); ! storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { --- 893,905 ---- LOG.error(e.getMessage(), e); return true; ! } ! ValueOccurrences oc = (ValueOccurrences) map.get(atomic); try { while (is.available() > 0) { ! boolean docAdded = false; ! int storedDocId = is.readInt(); ! int gidsCount = is.readInt(); ! DocumentImpl storedDocument = docs.getDoc(storedDocId); //Exit if the document is not concerned if (storedDocument == null) { *************** *** 959,982 **** continue; } ! storedGID = 0; for (int j = 0; j < gidsCount; j++) { ! delta = is.readLong(); ! storedGID = storedGID + delta; if (contextSet != null) { ! if (contextSet.parentWithChild(storedDocument, storedGID, false, true) != null) { if (oc == null) { oc = new ValueOccurrences(atomic); map.put(atomic, oc); } ! if (!docAdded) { oc.addDocument(storedDocument); docAdded = true; ! } ! oc.addOccurrences(1); } } //TODO : what if contextSet == null ? -pb //See above where we have this behaviour : ! //otherwise, we add all nodes without check } } --- 907,932 ---- continue; } ! 
long previousGID = 0; for (int j = 0; j < gidsCount; j++) { ! long delta = is.readLong(); ! long storedGID = previousGID + delta; if (contextSet != null) { ! NodeProxy parentNode = contextSet.parentWithChild(storedDocument, storedGID, false, true); ! if (parentNode != null) { if (oc == null) { oc = new ValueOccurrences(atomic); map.put(atomic, oc); } ! oc.addOccurrences(1); ! if (!docAdded) { oc.addDocument(storedDocument); docAdded = true; ! } } } //TODO : what if contextSet == null ? -pb //See above where we have this behaviour : ! //otherwise, we add all nodes without check ! previousGID = storedGID; } } |
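The final hunk reworks the occurrence-counting loop: previousGID is now tracked explicitly while walking the gap-encoded GID list, and the document is registered on the ValueOccurrences object only once per posting-list entry via the docAdded flag. Stripped of eXist's types, the counting logic looks roughly like the sketch below, where OccurrenceStats and the context predicate are stand-ins invented for illustration.

    import java.util.*;
    import java.util.function.LongPredicate;

    // Sketch: count how often an indexed value occurs beneath a context set,
    // registering each document only once (the docAdded flag from the hunk above).
    public class OccurrenceCountSketch {

        static class OccurrenceStats {
            int occurrences;
            Set<Integer> documents = new HashSet<>();
            public String toString() {
                return occurrences + " occurrence(s) in " + documents.size() + " document(s)";
            }
        }

        // gapsPerDoc: per document id, the gap-encoded sorted GID list;
        // inContext stands in for contextSet.parentWithChild(...) != null.
        static OccurrenceStats count(Map<Integer, long[]> gapsPerDoc, LongPredicate inContext) {
            OccurrenceStats oc = new OccurrenceStats();
            for (Map.Entry<Integer, long[]> entry : gapsPerDoc.entrySet()) {
                boolean docAdded = false;
                long previousGID = 0;
                for (long delta : entry.getValue()) {
                    long storedGID = previousGID + delta;
                    if (inContext.test(storedGID)) {
                        oc.occurrences++;
                        if (!docAdded) {                     // add the document only once
                            oc.documents.add(entry.getKey());
                            docAdded = true;
                        }
                    }
                    previousGID = storedGID;
                }
            }
            return oc;
        }

        public static void main(String[] args) {
            Map<Integer, long[]> gapsPerDoc = Map.of(1, new long[]{5, 3, 2}, 2, new long[]{100});
            System.out.println(count(gapsPerDoc, gid -> gid >= 8 && gid < 50));
            // prints: 2 occurrence(s) in 1 document(s)
        }
    }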
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-31 09:51:07
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv8863/src/org/exist/xquery

Modified Files:
	LocationStep.java
Log Message:
Optimize child:: axis selection: use a top-down approach if the context sequence has less than 50 items. For small sequences, traversing the node tree top-down is faster than the usual bottom-up approach used by eXist.

Index: LocationStep.java
===================================================================
RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/LocationStep.java,v
retrieving revision 1.66
retrieving revision 1.67
diff -C2 -d -r1.66 -r1.67
*** LocationStep.java	30 Dec 2005 20:07:18 -0000	1.66
--- LocationStep.java	31 Dec 2005 09:50:59 -0000	1.67
***************
*** 314,320 ****
  		// if there's just a single known node in the context, it is faster
  		// do directly search for the attribute in the parent node.
! 		} else if(
! 			axis == Constants.ATTRIBUTE_AXIS && contextSet.getLength() < ATTR_DIRECT_SELECT_THRESHOLD
! 			&& !(contextSet instanceof VirtualNodeSet)) {
  			NodeProxy proxy = contextSet.get(0);
  			if (proxy != null && proxy.getInternalAddress() != NodeProxy.UNKNOWN_NODE_ADDRESS)
--- 314,320 ----
  		// if there's just a single known node in the context, it is faster
  		// do directly search for the attribute in the parent node.
! 		} else if(!(contextSet instanceof VirtualNodeSet) &&
! 			axis == Constants.ATTRIBUTE_AXIS &&
! 			contextSet.getLength() < ATTR_DIRECT_SELECT_THRESHOLD) {
  			NodeProxy proxy = contextSet.get(0);
  			if (proxy != null && proxy.getInternalAddress() != NodeProxy.UNKNOWN_NODE_ADDRESS)
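This condition reordering belongs to the optimization described in the log message: when the context sequence is small, it is cheaper to walk down from the few context nodes than to run the usual index-driven bottom-up join. A toy sketch of that threshold-based dispatch is given below; the figure of 50 comes from the log message about the child axis (ATTR_DIRECT_SELECT_THRESHOLD's value is not shown in the diff), and every name in the sketch is a placeholder rather than eXist's API.

    import java.util.*;

    // Toy sketch of threshold-based strategy selection for an axis step:
    // small context sets are expanded top-down, larger ones are resolved
    // bottom-up from index candidates.
    public class AxisDispatchSketch {

        static final int TOP_DOWN_THRESHOLD = 50;            // figure quoted in the commit message

        static List<Long> selectChildren(List<Long> contextNodes,
                                         Map<Long, List<Long>> childrenOf,
                                         Set<Long> candidatesFromIndex) {
            if (contextNodes.size() < TOP_DOWN_THRESHOLD) {
                // top-down: expand each context node's children directly
                List<Long> result = new ArrayList<>();
                for (Long parent : contextNodes)
                    result.addAll(childrenOf.getOrDefault(parent, List.of()));
                return result;
            }
            // bottom-up: start from the index candidates and keep those whose
            // parent (here: candidate / 10, a toy relation) lies in the context
            Set<Long> context = new HashSet<>(contextNodes);
            List<Long> result = new ArrayList<>();
            for (Long candidate : candidatesFromIndex)
                if (context.contains(candidate / 10))
                    result.add(candidate);
            return result;
        }

        public static void main(String[] args) {
            Map<Long, List<Long>> childrenOf = Map.of(1L, List.of(10L, 11L), 2L, List.of(20L));
            System.out.println(selectChildren(List.of(1L, 2L), childrenOf, Set.of()));  // [10, 11, 20]
        }
    }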
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-31 09:50:06
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/dom In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv8597/src/org/exist/dom Modified Files: NodeProxy.java ExtArrayNodeSet.java NodeSet.java AbstractNodeSet.java ArraySet.java Log Message: Optimize child:: axis selection: use a top-down approach if the context sequence has less than 50 items. For small sequences, traversing the node tree top-down is faster than the usual bottom-up approach used by eXist. Index: ArraySet.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/ArraySet.java,v retrieving revision 1.44 retrieving revision 1.45 diff -C2 -d -r1.44 -r1.45 *** ArraySet.java 19 Dec 2005 17:10:35 -0000 1.44 --- ArraySet.java 31 Dec 2005 09:49:55 -0000 1.45 *************** *** 124,134 **** } - private final static NodeSet searchRange(NodeProxy[] items, int low, int high, - NodeProxy lower, NodeProxy upper) { - //TODO : dimension as high - low ? - ArraySet result = new ArraySet(100); - return searchRange(result, items, low, high, lower, upper); - } - /** * get all nodes contained in the set, which are greater or equal to lower --- 124,127 ---- *************** *** 143,147 **** *@return Description of the Return Value */ ! private final static NodeSet searchRange(ArraySet result, NodeProxy[] items, int low, int high, NodeProxy lower, NodeProxy upper) { int mid = 0; --- 136,140 ---- *@return Description of the Return Value */ ! private final static void searchRange(NodeSet result, NodeProxy[] items, int low, int high, NodeProxy lower, NodeProxy upper) { int mid = 0; *************** *** 168,173 **** while (mid <= max && items[mid].compareTo(upper) <= 0) result.add(items[mid++]); - - return result; } --- 161,164 ---- *************** *** 352,356 **** int dx; int cmp; - NodeProxy node; final int dlen = dl.length; boolean more = includeSelf ? true : getParentSet(dl, dlen); --- 343,346 ---- *************** *** 493,503 **** } ! public NodeSet getRange(NodeProxy lower, NodeProxy upper) { sort(); ! return searchRange(nodes, 0, counter - 1, lower, upper); } ! public NodeSet getRange(DocumentImpl doc, long lower, long upper) { ! return getRange(new NodeProxy(doc, lower), new NodeProxy(doc, upper)); } --- 483,493 ---- } ! public void getRange(NodeSet result, NodeProxy lower, NodeProxy upper) { sort(); ! searchRange(result, nodes, 0, counter - 1, lower, upper); } ! public void getRange(NodeSet result, DocumentImpl doc, long lower, long upper) { ! getRange(result, new NodeProxy(doc, lower), new NodeProxy(doc, upper)); } Index: AbstractNodeSet.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/AbstractNodeSet.java,v retrieving revision 1.61 retrieving revision 1.62 diff -C2 -d -r1.61 -r1.62 *** AbstractNodeSet.java 17 Dec 2005 15:02:27 -0000 1.61 --- AbstractNodeSet.java 31 Dec 2005 09:49:55 -0000 1.62 *************** *** 235,243 **** */ protected NodeSet hasChildrenInSet( ! NodeProxy parent, int mode, boolean rememberContext) { ! Range range = XMLUtil.getChildRange(parent.getDocument(), parent.getGID()); ! return getRange(parent.getDocument(), range.getStart(), range.getEnd()); } --- 235,249 ---- */ protected NodeSet hasChildrenInSet( ! NodeSet al, int mode, boolean rememberContext) { ! NodeSet result = new ExtArrayNodeSet(); ! NodeProxy node; ! for (Iterator i = al.iterator(); i.hasNext(); ) { ! node = (NodeProxy) i.next(); ! Range range = XMLUtil.getChildRange(node.getDocument(), node.getGID()); ! 
getRange(result, node.getDocument(), range.getStart(), range.getEnd()); ! } ! return result; } *************** *** 276,281 **** public NodeSet selectParentChild(NodeSet al, int mode, boolean rememberContext) { if (!(al instanceof VirtualNodeSet)) { ! if(al.getLength() == 1) ! return hasChildrenInSet(al.get(0), mode, rememberContext); else return quickSelectParentChild(al, mode, rememberContext); --- 282,287 ---- public NodeSet selectParentChild(NodeSet al, int mode, boolean rememberContext) { if (!(al instanceof VirtualNodeSet)) { ! if(al.getLength() < 10) ! return hasChildrenInSet(al, mode, rememberContext); else return quickSelectParentChild(al, mode, rememberContext); *************** *** 570,574 **** * @return */ ! public NodeSet getRange(DocumentImpl doc, long lower, long upper) { throw new RuntimeException( "getRange is not valid for class " + getClass().getName()); --- 576,580 ---- * @return */ ! public void getRange(NodeSet result, DocumentImpl doc, long lower, long upper) { throw new RuntimeException( "getRange is not valid for class " + getClass().getName()); *************** *** 642,646 **** public NodeSet except(NodeSet other) { AVLTreeNodeSet r = new AVLTreeNodeSet(); ! NodeProxy l, p; for (Iterator i = iterator(); i.hasNext();) { l = (NodeProxy) i.next(); --- 648,652 ---- public NodeSet except(NodeSet other) { AVLTreeNodeSet r = new AVLTreeNodeSet(); ! NodeProxy l; for (Iterator i = iterator(); i.hasNext();) { l = (NodeProxy) i.next(); Index: NodeProxy.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/NodeProxy.java,v retrieving revision 1.73 retrieving revision 1.74 diff -C2 -d -r1.73 -r1.74 *** NodeProxy.java 19 Dec 2005 17:10:35 -0000 1.73 --- NodeProxy.java 31 Dec 2005 09:49:55 -0000 1.74 *************** *** 871,882 **** * @see org.exist.dom.NodeSet#getRange(org.exist.dom.DocumentImpl, long, long) */ ! public NodeSet getRange(DocumentImpl document, long lower, long upper) { if (this.gid < lower) ! return NodeSet.EMPTY_SET; if (this.gid > upper) ! return NodeSet.EMPTY_SET; if(this.doc.getDocId() != document.getDocId()) ! return NodeSet.EMPTY_SET; ! return this; } --- 871,882 ---- * @see org.exist.dom.NodeSet#getRange(org.exist.dom.DocumentImpl, long, long) */ ! public void getRange(NodeSet result, DocumentImpl document, long lower, long upper) { if (this.gid < lower) ! return; if (this.gid > upper) ! return; if(this.doc.getDocId() != document.getDocId()) ! return; ! result.add(this); } Index: ExtArrayNodeSet.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/ExtArrayNodeSet.java,v retrieving revision 1.33 retrieving revision 1.34 diff -C2 -d -r1.33 -r1.34 *** ExtArrayNodeSet.java 17 Dec 2005 19:03:23 -0000 1.33 --- ExtArrayNodeSet.java 31 Dec 2005 09:49:55 -0000 1.34 *************** *** 349,363 **** } ! public NodeSet getRange(DocumentImpl doc, long lower, long upper) { final Part part = getPart(doc, false, 0); ! return part.getRange(lower, upper); } ! public NodeSet hasChildrenInSet(NodeProxy parent, int mode, ! boolean rememberContext) { ! final Part part = getPart(parent.getDocument(), false, 0); ! if (part == null) ! return new ArraySet(1); ! return part.getChildrenInSet(parent, mode, rememberContext); } --- 349,368 ---- } ! public void getRange(NodeSet result, DocumentImpl doc, long lower, long upper) { final Part part = getPart(doc, false, 0); ! part.getRange(result, lower, upper); } ! 
protected NodeSet hasChildrenInSet(NodeSet al, int mode, boolean rememberContext) { ! NodeSet result = new ExtArrayNodeSet(); ! NodeProxy node; ! Part part; ! for (Iterator i = al.iterator(); i.hasNext(); ) { ! node = (NodeProxy) i.next(); ! part = getPart(node.getDocument(), false, 0); ! if (part != null) ! part.getChildrenInSet(result, node, mode, rememberContext); ! } ! return result; } *************** *** 611,616 **** * @return */ ! NodeSet getChildrenInSet(NodeProxy parent, int mode, boolean rememberContext) { ! NodeSet result = new ExtArrayNodeSet(); // get the range of node ids reserved for children of the parent // node --- 616,620 ---- * @return */ ! NodeSet getChildrenInSet(NodeSet result, NodeProxy parent, int mode, boolean rememberContext) { // get the range of node ids reserved for children of the parent // node *************** *** 661,666 **** } ! NodeSet getRange(long lower, long upper) { ! NodeSet result = new ExtArrayNodeSet((int) (upper - lower) + 1); int low = 0; int high = length - 1; --- 665,669 ---- } ! NodeSet getRange(NodeSet result, long lower, long upper) { int low = 0; int high = length - 1; Index: NodeSet.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/NodeSet.java,v retrieving revision 1.38 retrieving revision 1.39 diff -C2 -d -r1.38 -r1.39 *** NodeSet.java 16 Aug 2005 16:37:45 -0000 1.38 --- NodeSet.java 31 Dec 2005 09:49:55 -0000 1.39 *************** *** 303,308 **** /** ! * Return a sub-range of this node set containing the range of nodes greater than or including ! * the lower node and smaller than or including the upper node. * * @param doc --- 303,309 ---- /** ! * Create a sub-range of this node set containing the range of nodes greater than or including ! * the lower node and smaller than or including the upper node. Matching nodes are added to the ! * given result node set. * * @param doc *************** *** 311,315 **** * @return */ ! public NodeSet getRange(DocumentImpl doc, long lower, long upper); /** --- 312,316 ---- * @return */ ! public void getRange(NodeSet result, DocumentImpl doc, long lower, long upper); /** |
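The log message above describes a size-based strategy switch: a small context sequence is cheaper to resolve by walking each context node's subtree top-down than by the generic bottom-up join used for large node sets. Below is a stand-alone sketch of only that dispatch idea; it is not eXist code, the Node/children() types are invented, and the cut-off constant is an assumption (the committed code hard-codes its own bound inside AbstractNodeSet.selectParentChild).

import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a threshold-based strategy switch; not eXist code.
final class ChildAxisSelector {
    // Assumed cut-off for illustration only.
    private static final int TOP_DOWN_THRESHOLD = 10;

    // Invented node type with direct access to its children (top-down friendly).
    interface Node {
        List<Node> children();
    }

    static List<Node> selectChildren(List<Node> context) {
        if (context.size() < TOP_DOWN_THRESHOLD) {
            // Few context nodes: walk each subtree top-down.
            List<Node> result = new ArrayList<>();
            for (Node parent : context) {
                result.addAll(parent.children());
            }
            return result;
        }
        // Large context: fall back to the generic bottom-up join.
        return bottomUpJoin(context);
    }

    private static List<Node> bottomUpJoin(List<Node> context) {
        // Placeholder for the index-driven bottom-up strategy (not shown here).
        throw new UnsupportedOperationException("not part of this sketch");
    }
}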
|
From: Pierrick B. <br...@us...> - 2005-12-31 09:07:40
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv2909/src/org/exist/storage Modified Files: NativeTextEngine.java NativeElementIndex.java NativeValueIndex.java Log Message: Code cleaning, comments... Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.74 retrieving revision 1.75 diff -C2 -d -r1.74 -r1.75 *** NativeElementIndex.java 30 Dec 2005 22:51:44 -0000 1.74 --- NativeElementIndex.java 31 Dec 2005 09:07:32 -0000 1.75 *************** *** 544,548 **** if (is == null) { ! //TODO : Should is be null, what will there be in os.data() ? -pb if (dbNodes.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not put index data for node '" + qname + "'"); --- 544,548 ---- if (is == null) { ! //TOUNDERSTAND : Should is be null, what will there be in os.data() ? -pb if (dbNodes.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not put index data for node '" + qname + "'"); *************** *** 573,576 **** --- 573,577 ---- public NodeSet getAttributesByName(DocumentSet docs, QName qname, NodeSelector selector) { + //TODO : should we consider ElementValue.ATTRIBUTE_ID as well ? -pb return findElementsByTagName(ElementValue.ATTRIBUTE, docs, qname, selector); } *************** *** 589,599 **** public NodeSet findElementsByTagName(byte type, DocumentSet docs, QName qname, NodeSelector selector) { short nodeType; ! switch (type) { case ElementValue.ATTRIBUTE : nodeType = Node.ATTRIBUTE_NODE; ! break; ! //TODO : stricter control -pb ! default : nodeType = Node.ELEMENT_NODE; } final ExtArrayNodeSet result = new ExtArrayNodeSet(docs.getLength(), 256); --- 590,603 ---- public NodeSet findElementsByTagName(byte type, DocumentSet docs, QName qname, NodeSelector selector) { short nodeType; ! switch (type) { ! case ElementValue.ATTRIBUTE_ID : //is this correct ? -pb case ElementValue.ATTRIBUTE : nodeType = Node.ATTRIBUTE_NODE; ! break; ! case ElementValue.ELEMENT : nodeType = Node.ELEMENT_NODE; + break; + default : + throw new IllegalArgumentException("Invalid type"); } final ExtArrayNodeSet result = new ExtArrayNodeSet(docs.getLength(), 256); *************** *** 627,631 **** lock.acquire(Lock.READ_LOCK); is = dbNodes.getAsStream(ref); ! //Does the node already exist in the index ? if (is == null) continue; --- 631,635 ---- lock.acquire(Lock.READ_LOCK); is = dbNodes.getAsStream(ref); ! //Does the node already has data in the index ? if (is == null) continue; Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.123 retrieving revision 1.124 diff -C2 -d -r1.123 -r1.124 *** NativeTextEngine.java 30 Dec 2005 22:51:44 -0000 1.123 --- NativeTextEngine.java 31 Dec 2005 09:07:32 -0000 1.124 *************** *** 341,350 **** try { lock.acquire(Lock.WRITE_LOCK); ! changed = false; ! is = dbTokens.getAsStream(ref); os.clear(); ! if (is == null) { ! continue; ! } try { while (is.available() > 0) { --- 341,350 ---- try { lock.acquire(Lock.WRITE_LOCK); ! changed = false; os.clear(); ! is = dbTokens.getAsStream(ref); ! //Does the token already has data in the index ? ! if (is == null) ! continue; try { while (is.available() > 0) { *************** *** 453,457 **** NodeProxy storedNode; NodeProxy parent; ! 
int sizeHint = -1; Match match; for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { --- 453,457 ---- NodeProxy storedNode; NodeProxy parent; ! int sizeHint; Match match; for (Iterator iter = docs.getCollectionIterator(); iter.hasNext();) { *************** *** 464,471 **** lock.acquire(); is = dbTokens.getAsStream(ref); ! //Does the node already exist in the index ? ! if (is == null) { ! continue; ! } while (is.available() > 0) { storedDocId = is.readInt(); --- 464,470 ---- lock.acquire(); is = dbTokens.getAsStream(ref); ! //Does the token already has data in the index ? ! if (is == null) ! continue; while (is.available() > 0) { storedDocId = is.readInt(); *************** *** 881,885 **** lenOffset = os.position(); os.writeFixedInt(0); - previousGID = 0; for (int j = 0; j < occurences.getSize(); ) { --- 880,883 ---- *************** *** 894,899 **** j += freq; } ! os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); ! flushWord(collectionId, token, os.data()); progress.setValue(count); --- 892,896 ---- j += freq; } ! os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); flushWord(collectionId, token, os.data()); progress.setValue(count); *************** *** 914,918 **** } ! private void flushWord(short collectionId, String word, ByteArray data) { //return early //TODO : is this ever called ? -pb --- 911,915 ---- } ! private void flushWord(short collectionId, String token, ByteArray data) { //return early //TODO : is this ever called ? -pb *************** *** 922,926 **** try { lock.acquire(Lock.WRITE_LOCK); ! dbTokens.append(new WordRef(collectionId, word), data); } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); --- 919,923 ---- try { lock.acquire(Lock.WRITE_LOCK); ! dbTokens.append(new WordRef(collectionId, token), data); } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); *************** *** 973,977 **** lock.acquire(Lock.WRITE_LOCK); value = dbTokens.get(ref); ! //Does the token already exist in the index ? if (value != null) { //Add its data to the new list --- 970,974 ---- lock.acquire(Lock.WRITE_LOCK); value = dbTokens.get(ref); ! //Does the token already has data in the index ? if (value != null) { //Add its data to the new list *************** *** 1178,1183 **** os.writeInt(termCount); lenOffset = os.position(); ! os.writeFixedInt(0); ! previousGID = 0; for (int m = 0; m < storedOccurencesList.getSize(); ) { --- 1175,1179 ---- os.writeInt(termCount); lenOffset = os.position(); ! os.writeFixedInt(0); previousGID = 0; for (int m = 0; m < storedOccurencesList.getSize(); ) { *************** *** 1196,1199 **** --- 1192,1196 ---- //Store the data if (is == null) { + //TOUNDERSTAND : Should is be null, what will there be in os.data() ? -pb if (dbTokens.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.error("Could not put index data for token '" + token + "'"); *************** *** 1242,1257 **** public boolean indexInfo(Value key, long pointer) throws TerminatedException { if(context != null) ! context.proceed(); ! String word; try { ! word = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); } catch (UnsupportedEncodingException e) { LOG.error(e.getMessage(), e); //word = new String(key.getData(), 2, key.getLength() - 2); ! word = null; ! } ! if (matcher.matches(word)) ! matches.add(word); ! return true; } } --- 1239,1253 ---- public boolean indexInfo(Value key, long pointer) throws TerminatedException { if(context != null) ! 
context.proceed(); try { ! String word = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); ! if (matcher.matches(word)) ! matches.add(word); ! return true; } catch (UnsupportedEncodingException e) { LOG.error(e.getMessage(), e); //word = new String(key.getData(), 2, key.getLength() - 2); ! return true; ! } } } *************** *** 1275,1292 **** } ! public boolean indexInfo(Value key, long pointer) throws TerminatedException { word.reuse(); word = UTF8.decode(key.getData(), 2, key.getLength() - 2, word); ! if (matcher.matches(word)) { ! VariableByteInput is = null; ! try { ! is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.warn(e.getMessage(), e); ! //TODO : return early -pb ! } ! if (is == null) ! return true; ! int storedDocId; byte storedSection; --- 1271,1286 ---- } ! public boolean indexInfo(Value key, long pointer) throws TerminatedException { ! VariableByteInput is; ! try { ! is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! return true; ! } ! word.reuse(); word = UTF8.decode(key.getData(), 2, key.getLength() - 2, word); ! if (matcher.matches(word)) { int storedDocId; byte storedSection; *************** *** 1303,1308 **** int sizeHint; try { ! while (is.available() > 0) { ! if(context != null) context.proceed(); --- 1297,1301 ---- int sizeHint; try { ! while (is.available() > 0) { if(context != null) context.proceed(); *************** *** 1391,1413 **** */ public boolean indexInfo(Value key, long pointer) throws TerminatedException { VariableByteInput is; try { is = dbTokens.getAsStream(pointer); } catch (IOException e) { - LOG.warn(e.getMessage(), e); - //TODO : return early -pb - is = null; - } - if (is == null) - return true; - String term; - try { - term = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); - } catch (UnsupportedEncodingException e) { LOG.error(e.getMessage(), e); ! //term = new String(key.getData(), 2, key.getLength() - 2); ! //TODO : return early ! -pb ! term = null; ! } int storedDocId; byte storedSection; --- 1384,1405 ---- */ public boolean indexInfo(Value key, long pointer) throws TerminatedException { + + String term; + try { + term = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOG.error(e.getMessage(), e); + //term = new String(key.getData(), 2, key.getLength() - 2); + return true; + } + VariableByteInput is; try { is = dbTokens.getAsStream(pointer); } catch (IOException e) { LOG.error(e.getMessage(), e); ! return true; ! } ! int storedDocId; byte storedSection; *************** *** 1420,1428 **** NodeProxy parentNode; int freq; ! boolean include = true; boolean docAdded; try { while (is.available() > 0) { ! docAdded = false; storedDocId = is.readInt(); storedSection = is.readByte(); --- 1412,1420 ---- NodeProxy parentNode; int freq; ! boolean include; boolean docAdded; try { while (is.available() > 0) { ! docAdded = false; storedDocId = is.readInt(); storedSection = is.readByte(); Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.57 retrieving revision 1.58 diff -C2 -d -r1.57 -r1.58 *** NativeValueIndex.java 30 Dec 2005 22:51:46 -0000 1.57 --- NativeValueIndex.java 31 Dec 2005 09:07:32 -0000 1.58 *************** *** 560,564 **** //Store the data if (is == null) { ! //TODO : Should is be null, what will there be in os.data() ? 
-pb if (dbValues.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not put index data for value '" + ref + "'"); --- 560,564 ---- //Store the data if (is == null) { ! //TOUNDERSTAND : Should is be null, what will there be in os.data() ? -pb if (dbValues.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { LOG.warn("Could not put index data for value '" + ref + "'"); *************** *** 817,829 **** */ public boolean indexInfo(Value value, long pointer) throws TerminatedException { ! VariableByteInput is = null; try { is = dbValues.getAsStream(pointer); } catch (IOException e) { LOG.error(e.getMessage(), e); ! //TODO : return early -pb ! } ! if (is == null) ! return true; int storedDocId; --- 817,827 ---- */ public boolean indexInfo(Value value, long pointer) throws TerminatedException { ! VariableByteInput is; try { is = dbValues.getAsStream(pointer); } catch (IOException e) { LOG.error(e.getMessage(), e); ! return true; ! } int storedDocId; *************** *** 931,947 **** return false; } catch (EXistException e) { ! LOG.warn(e.getMessage(), e); return true; } ! VariableByteInput is = null; try { is = dbValues.getAsStream(pointer); } catch (IOException e) { LOG.error(e.getMessage(), e); - //TODO : return early -pb - } - if (is == null) return true; int storedDocId; --- 929,943 ---- return false; } catch (EXistException e) { ! LOG.error(e.getMessage(), e); return true; } ! VariableByteInput is; try { is = dbValues.getAsStream(pointer); } catch (IOException e) { LOG.error(e.getMessage(), e); return true; + } int storedDocId; *************** *** 954,957 **** --- 950,954 ---- try { while (is.available() > 0) { + docAdded = false; storedDocId = is.readInt(); gidsCount = is.readInt(); *************** *** 961,973 **** is.skip(gidsCount); continue; ! } ! docAdded = false; storedGID = 0; for (int j = 0; j < gidsCount; j++) { delta = is.readLong(); ! storedGID = storedGID + delta; ! //TODO : what if contextSet == null ? -pb ! //See above where we have this behaviour : ! //otherwise, we add all nodes without check if (contextSet != null) { if (contextSet.parentWithChild(storedDocument, storedGID, false, true) != null) { --- 958,966 ---- is.skip(gidsCount); continue; ! } storedGID = 0; for (int j = 0; j < gidsCount; j++) { delta = is.readLong(); ! storedGID = storedGID + delta; if (contextSet != null) { if (contextSet.parentWithChild(storedDocument, storedGID, false, true) != null) { *************** *** 983,986 **** --- 976,982 ---- } } + //TODO : what if contextSet == null ? -pb + //See above where we have this behaviour : + //otherwise, we add all nodes without check } } |
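One pattern worth calling out in this cleanup is the stricter switch in findElementsByTagName: ElementValue.ATTRIBUTE_ID now falls through to the attribute case, and an unknown type fails fast instead of silently being treated as an element. A minimal, self-contained version of that pattern, using invented constants rather than eXist's ElementValue and org.w3c.dom.Node classes:

// Sketch of a strict type-to-nodeType mapping; all constants are invented here.
final class SectionTypes {
    static final byte ELEMENT = 0;
    static final byte ATTRIBUTE = 1;
    static final byte ATTRIBUTE_ID = 2;

    static final short ELEMENT_NODE = 1;    // mirrors the DOM node-type idea
    static final short ATTRIBUTE_NODE = 2;

    static short nodeTypeFor(byte type) {
        switch (type) {
            case ATTRIBUTE_ID:   // ID attributes are still attributes
            case ATTRIBUTE:
                return ATTRIBUTE_NODE;
            case ELEMENT:
                return ELEMENT_NODE;
            default:
                // Fail fast instead of silently treating unknown types as elements.
                throw new IllegalArgumentException("Invalid type: " + type);
        }
    }
}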
|
From: Pierrick B. <br...@us...> - 2005-12-30 22:51:54
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv27460/src/org/exist/storage Modified Files: NativeTextEngine.java NativeElementIndex.java TextSearchEngine.java NativeValueIndexByQName.java NativeBroker.java NativeValueIndex.java Log Message: slight change in some interfaces Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.56 retrieving revision 1.57 diff -C2 -d -r1.56 -r1.57 *** NativeValueIndex.java 30 Dec 2005 21:45:45 -0000 1.56 --- NativeValueIndex.java 30 Dec 2005 22:51:46 -0000 1.57 *************** *** 176,179 **** --- 176,199 ---- } + public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + // TODO Auto-generated method stub + } + + public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + // TODO Auto-generated method stub + } + + public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { + // TODO Auto-generated method stub + } + + public void endElement(int xpathType, ElementImpl node, String content) { + // TODO Auto-generated method stub + } + + public void removeElement(ElementImpl node, NodePath currentPath, String content) { + // TODO Auto-generated method stub + } + /* (non-Javadoc) * @see org.exist.storage.IndexGenerator#sync() *************** *** 765,768 **** --- 785,796 ---- } + public boolean close() throws DBException { + return dbValues.close(); + } + + public void printStatistics() { + dbValues.printStatistics(); + } + public String toString() { return this.getClass().getName() + " at "+ dbValues.getFile().getName() + *************** *** 966,989 **** } } - - public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - // TODO Auto-generated method stub - } - - public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - // TODO Auto-generated method stub - } - - public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { - // TODO Auto-generated method stub - } - - public void endElement(int xpathType, ElementImpl node, String content) { - // TODO Auto-generated method stub - } - - public void removeElement(ElementImpl node, NodePath currentPath, String content) { - // TODO Auto-generated method stub - } - } --- 994,996 ---- Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.122 retrieving revision 1.123 diff -C2 -d -r1.122 -r1.123 *** NativeTextEngine.java 30 Dec 2005 22:16:16 -0000 1.122 --- NativeTextEngine.java 30 Dec 2005 22:51:44 -0000 1.123 *************** *** 223,226 **** --- 223,254 ---- } } + } + + public void storeAttribute(RangeIndexSpec spec, AttrImpl node) { + // TODO Auto-generated method stub + } + + public void setDocument(DocumentImpl document) { + //TODO Auto-generated method stub + } + + public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + //TODO Auto-generated method stub + } + + public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + // TODO Auto-generated method stub + } + + public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { + // TODO Auto-generated method stub + } + + public void endElement(int 
xpathType, ElementImpl node, String content) { + // TODO Auto-generated method stub + } + + public void removeElement(ElementImpl node, NodePath currentPath, String content) { + // TODO Auto-generated method stub } *************** *** 741,750 **** } ! public void close() { ! try { ! dbTokens.close(); ! } catch (DBException dbe) { ! LOG.debug(dbe); ! } } --- 769,783 ---- } ! public boolean close() throws DBException { ! return dbTokens.close(); ! } ! ! public void printStatistics() { ! dbTokens.printStatistics(); ! } ! ! public String toString() { ! return this.getClass().getName() + " at "+ dbTokens.getFile().getName() + ! " owned by " + broker.toString(); } *************** *** 1652,1683 **** } } - - public void storeAttribute(RangeIndexSpec spec, AttrImpl node) { - // TODO Auto-generated method stub - } - - public void setDocument(DocumentImpl document) { - //TODO Auto-generated method stub - } - - public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - //TODO Auto-generated method stub - } - - public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - // TODO Auto-generated method stub - } - - public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { - // TODO Auto-generated method stub - } - - public void endElement(int xpathType, ElementImpl node, String content) { - // TODO Auto-generated method stub - } - - public void removeElement(ElementImpl node, NodePath currentPath, String content) { - // TODO Auto-generated method stub - } } --- 1685,1688 ---- Index: NativeValueIndexByQName.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndexByQName.java,v retrieving revision 1.22 retrieving revision 1.23 diff -C2 -d -r1.22 -r1.23 *** NativeValueIndexByQName.java 27 Dec 2005 19:50:26 -0000 1.22 --- NativeValueIndexByQName.java 30 Dec 2005 22:51:45 -0000 1.23 *************** *** 306,312 **** } ! public void close() throws DBException { if (qnameValueIndexation) ! dbValues.close(); } --- 306,313 ---- } ! public boolean close() throws DBException { if (qnameValueIndexation) ! return dbValues.close(); ! return true; } Index: TextSearchEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/TextSearchEngine.java,v retrieving revision 1.16 retrieving revision 1.17 diff -C2 -d -r1.16 -r1.17 *** TextSearchEngine.java 9 Feb 2005 08:05:22 -0000 1.16 --- TextSearchEngine.java 30 Dec 2005 22:51:45 -0000 1.17 *************** *** 39,42 **** --- 39,43 ---- import org.exist.storage.analysis.SimpleTokenizer; import org.exist.storage.analysis.Tokenizer; + import org.exist.storage.btree.DBException; import org.exist.storage.serializers.Serializer; import org.exist.util.Configuration; *************** *** 164,168 **** public abstract void flush(); ! public abstract void close(); public int getTrackMatches() { --- 165,169 ---- public abstract void flush(); ! 
public abstract boolean close() throws DBException; public int getTrackMatches() { Index: NativeBroker.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeBroker.java,v retrieving revision 1.225 retrieving revision 1.226 diff -C2 -d -r1.225 -r1.226 *** NativeBroker.java 27 Dec 2005 21:33:45 -0000 1.225 --- NativeBroker.java 30 Dec 2005 22:51:45 -0000 1.226 *************** *** 2002,2006 **** valuesDbQname.closeAndRemove(); config.setProperty("db-connection2.values", null); ! } LOG.debug("Recreating index files ..."); try { --- 2002,2007 ---- valuesDbQname.closeAndRemove(); config.setProperty("db-connection2.values", null); ! } ! LOG.debug("Recreating index files ..."); try { *************** *** 2684,2692 **** flush(); sync(Sync.MAJOR_SYNC); ! textEngine.close(); domDb.close(); elementsDb.close(); valuesDb.close(); ! collectionsDb.close(); // if (qnameValueIndexation) --- 2685,2693 ---- flush(); sync(Sync.MAJOR_SYNC); ! textEngine.close(); domDb.close(); elementsDb.close(); valuesDb.close(); ! collectionsDb.close(); // if (qnameValueIndexation) *************** *** 2929,2938 **** // uncomment this to get statistics on page buffer usage ! collectionsDb.printStatistics(); elementsDb.printStatistics(); valuesDb.printStatistics(); ! domDb.printStatistics(); if (valuesDbQname != null) valuesDbQname.printStatistics(); } } catch (DBException dbe) { --- 2930,2940 ---- // uncomment this to get statistics on page buffer usage ! collectionsDb.printStatistics(); elementsDb.printStatistics(); valuesDb.printStatistics(); ! domDb.printStatistics(); if (valuesDbQname != null) valuesDbQname.printStatistics(); + textEngine.printStatistics(); } } catch (DBException dbe) { Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.73 retrieving revision 1.74 diff -C2 -d -r1.73 -r1.74 *** NativeElementIndex.java 30 Dec 2005 21:45:45 -0000 1.73 --- NativeElementIndex.java 30 Dec 2005 22:51:44 -0000 1.74 *************** *** 101,104 **** --- 101,124 ---- } + public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + // TODO Auto-generated method stub + } + + public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { + // TODO Auto-generated method stub + } + + public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { + // TODO Auto-generated method stub + } + + public void endElement(int xpathType, ElementImpl node, String content) { + // TODO Auto-generated method stub + } + + public void removeElement(ElementImpl node, NodePath currentPath, String content) { + // TODO Auto-generated method stub + } + /* (non-Javadoc) * @see org.exist.storage.ContentLoadingObserver#sync() *************** *** 841,864 **** dbNodes.printStatistics(); } - - public void storeAttribute(AttrImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - // TODO Auto-generated method stub - } - - public void storeText(TextImpl node, NodePath currentPath, boolean fullTextIndexSwitch) { - // TODO Auto-generated method stub - } - - public void startElement(ElementImpl impl, NodePath currentPath, boolean index) { - // TODO Auto-generated method stub - } - - public void endElement(int xpathType, ElementImpl node, String content) { - // TODO Auto-generated method stub - } - - public void removeElement(ElementImpl node, NodePath 
currentPath, String content) { - // TODO Auto-generated method stub - } public String toString() { --- 861,864 ---- |
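The interface change here is small but has consequences for callers: TextSearchEngine.close() now returns a boolean and declares DBException, so a failed close can be detected rather than merely logged inside the engine. A rough sketch of what that contract means at a call site; the Closable interface, the closeAll() helper, and the use of a plain Exception in place of DBException are assumptions for illustration:

// Illustration of the boolean/exception close() contract; names are invented.
final class BrokerShutdown {
    interface Closable {
        boolean close() throws Exception; // stands in for DBException
    }

    static void closeAll(Closable... resources) {
        for (Closable r : resources) {
            try {
                if (!r.close()) {
                    // The return value now lets callers notice a failed close.
                    System.err.println("close() reported failure for " + r);
                }
            } catch (Exception e) {
                System.err.println("close() threw: " + e.getMessage());
            }
        }
    }
}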
|
From: Pierrick B. <br...@us...> - 2005-12-30 22:16:24
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv21517/src/org/exist/storage Modified Files: NativeTextEngine.java Log Message: Code cleaning, comments... Code can now be reviewed. Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.121 retrieving revision 1.122 diff -C2 -d -r1.121 -r1.122 *** NativeTextEngine.java 30 Dec 2005 21:45:45 -0000 1.121 --- NativeTextEngine.java 30 Dec 2005 22:16:16 -0000 1.122 *************** *** 1256,1270 **** return true; ! int storedDocId; byte storedSection; ! int termCount; long storedGID; long previousGID; long delta; ! int size; DocumentImpl storedDocument; NodeProxy storedNode; NodeProxy parentNode; ! Match match; int freq; int sizeHint; --- 1256,1270 ---- return true; ! int storedDocId; byte storedSection; ! int termCount; long storedGID; long previousGID; long delta; ! int size; DocumentImpl storedDocument; NodeProxy storedNode; NodeProxy parentNode; ! Match match; int freq; int sizeHint; *************** *** 1357,1363 **** * @see org.dbxml.core.filer.BTreeCallback#indexInfo(org.dbxml.core.data.Value, long) */ ! public boolean indexInfo(Value key, long pointer) ! throws TerminatedException { ! String term; try { term = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); --- 1357,1372 ---- * @see org.dbxml.core.filer.BTreeCallback#indexInfo(org.dbxml.core.data.Value, long) */ ! public boolean indexInfo(Value key, long pointer) throws TerminatedException { ! VariableByteInput is; ! try { ! is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.warn(e.getMessage(), e); ! //TODO : return early -pb ! is = null; ! } ! if (is == null) ! return true; ! String term; try { term = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); *************** *** 1365,1410 **** LOG.error(e.getMessage(), e); //term = new String(key.getData(), 2, key.getLength() - 2); term = null; ! } ! Occurrences oc = (Occurrences) map.get(term); ! ! VariableByteInput is = null; ! try { ! is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.warn(e.getMessage(), e); ! //TODO : return early -pb ! } ! if (is == null) ! return true; try { - int docId; - byte section; - - long storedGID; - long previousGID; - long delta; - - int len; - int rawSize; - int freq = 1; - - DocumentImpl doc; - boolean include = true; - boolean docAdded; - NodeProxy p; while (is.available() > 0) { ! docId = is.readInt(); ! section = ! is.readByte(); ! len = is.readInt(); ! rawSize = is.readFixedInt(); ! if ((doc = docs.getDoc(docId)) == null) { ! is.skipBytes(rawSize); continue; ! } ! docAdded = false; previousGID = 0; ! for (int j = 0; j < len; j++) { delta = is.readLong(); storedGID = previousGID + delta; --- 1374,1407 ---- LOG.error(e.getMessage(), e); //term = new String(key.getData(), 2, key.getLength() - 2); + //TODO : return early ! -pb term = null; ! } ! int storedDocId; ! byte storedSection; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! int size; ! DocumentImpl storedDocument; ! NodeProxy parentNode; ! int freq; ! boolean include = true; ! boolean docAdded; try { while (is.available() > 0) { ! docAdded = false; ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned ! 
if (storedDocument == null) { ! is.skipBytes(size); continue; ! } previousGID = 0; ! for (int j = 0; j < termCount; j++) { delta = is.readLong(); storedGID = previousGID + delta; *************** *** 1413,1442 **** if (contextSet != null) { include = false; ! p = contextSet.parentWithChild(doc, storedGID, false, true); ! if (p != null) { ! if (section == ATTRIBUTE_SECTION) { ! include = (p.getNodeType() == Node.ATTRIBUTE_NODE); ! } else { ! include = p != null; ! } ! } ! } ! if (include) { ! if (oc == null) { ! oc = new Occurrences(term); ! map.put(term, oc); ! } ! if (!docAdded) { ! oc.addDocument(doc); ! docAdded = true; ! } ! oc.addOccurrences(freq); ! } previousGID = storedGID; } } } catch(EOFException e) { } catch(IOException e) { ! LOG.warn("Exception while scanning index: " + e.getMessage(), e); } return true; --- 1410,1446 ---- if (contextSet != null) { include = false; ! parentNode = contextSet.parentWithChild(storedDocument, storedGID, false, true); ! switch (storedSection) { ! case TEXT_SECTION : ! //TODO : also test on Node.TEXT_NODE like below ? -pb ! include = (parentNode != null); ! break; ! case ATTRIBUTE_SECTION : ! include = (parentNode != null && parentNode.getNodeType() == Node.ATTRIBUTE_NODE); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } ! if (include) { ! Occurrences oc = (Occurrences) map.get(term); ! if (oc == null) { ! oc = new Occurrences(term); ! map.put(term, oc); ! } ! if (!docAdded) { ! oc.addDocument(storedDocument); ! docAdded = true; ! } ! oc.addOccurrences(freq); ! } ! } previousGID = storedGID; } } } catch(EOFException e) { + //EOFExceptions are expected } catch(IOException e) { ! LOG.error(e.getMessage(), e); ! //TODO : return early -pb } return true; |
|
From: Pierrick B. <br...@us...> - 2005-12-30 21:45:53
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv16313/src/org/exist/storage Modified Files: NativeTextEngine.java NativeElementIndex.java NativeValueIndex.java Log Message: Code cleaning, comments... Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.72 retrieving revision 1.73 diff -C2 -d -r1.72 -r1.73 *** NativeElementIndex.java 30 Dec 2005 20:21:57 -0000 1.72 --- NativeElementIndex.java 30 Dec 2005 21:45:45 -0000 1.73 *************** *** 616,620 **** size = is.readFixedInt(); storedDocument = docs.getDoc(storedDocId); ! //TOUNDERSTAND : how could this be possible ? -pb if (storedDocument == null) { is.skipBytes(size); --- 616,620 ---- size = is.readFixedInt(); storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned if (storedDocument == null) { is.skipBytes(size); Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.120 retrieving revision 1.121 diff -C2 -d -r1.120 -r1.121 *** NativeTextEngine.java 30 Dec 2005 21:03:20 -0000 1.120 --- NativeTextEngine.java 30 Dec 2005 21:45:45 -0000 1.121 *************** *** 446,462 **** size = is.readFixedInt(); storedDocument = docs.getDoc(storedDocId); ! //TOUNDERSTAND : how could this be possible ? -pb ! if (storedDocument == null) { is.skipBytes(size); continue; ! } ! //TOUNDERSTAND : does a null contextSet makes sense ? -pb if (contextSet != null) { ! //Exit if the current document is not concerned if (!contextSet.containsDoc(storedDocument)) { is.skipBytes(size); continue; ! } ! sizeHint = contextSet.getSizeHint(storedDocument); } //Process the nodes --- 446,460 ---- size = is.readFixedInt(); storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned ! if (storedDocument == null) { is.skipBytes(size); continue; ! } if (contextSet != null) { ! //Exit if the document is not concerned if (!contextSet.containsDoc(storedDocument)) { is.skipBytes(size); continue; ! } } //Process the nodes *************** *** 494,497 **** --- 492,496 ---- readOccurrences(freq, is, match, token.length()); parent.addMatch(match); + sizeHint = contextSet.getSizeHint(storedDocument); result.add(parent, sizeHint); } else { *************** *** 503,507 **** readOccurrences(freq, is, match, token.length()); storedNode.addMatch(match); ! result.add(storedNode, sizeHint); } context.proceed(); --- 502,506 ---- readOccurrences(freq, is, match, token.length()); storedNode.addMatch(match); ! result.add(storedNode, -1); } context.proceed(); *************** *** 837,844 **** os.writeInt(this.doc.getDocId()); switch (currentSection) { ! case 0 : os.writeByte(TEXT_SECTION); break; ! case 1 : os.writeByte(ATTRIBUTE_SECTION); break; --- 836,843 ---- os.writeInt(this.doc.getDocId()); switch (currentSection) { ! case TEXT_SECTION : os.writeByte(TEXT_SECTION); break; ! case ATTRIBUTE_SECTION : os.writeByte(ATTRIBUTE_SECTION); break; *************** *** 991,998 **** os.writeInt(this.doc.getDocId()); switch (currentSection) { ! case 0 : os.writeByte(TEXT_SECTION); break; ! case 1 : os.writeByte(ATTRIBUTE_SECTION); break; --- 990,997 ---- os.writeInt(this.doc.getDocId()); switch (currentSection) { ! case TEXT_SECTION : os.writeByte(TEXT_SECTION); break; ! 
case ATTRIBUTE_SECTION : os.writeByte(ATTRIBUTE_SECTION); break; *************** *** 1048,1052 **** public void reindex(DocumentImpl document, NodeImpl node) { OccurrenceList storedOccurencesList; ! int termCount; long storedGID; long previousGID; --- 1047,1051 ---- public void reindex(DocumentImpl document, NodeImpl node) { OccurrenceList storedOccurencesList; ! int termCount; long storedGID; long previousGID; *************** *** 1135,1142 **** os.writeInt(document.getDocId()); switch (currentSection) { ! case 0 : os.writeByte(TEXT_SECTION); break; ! case 1 : os.writeByte(ATTRIBUTE_SECTION); break; --- 1134,1141 ---- os.writeInt(document.getDocId()); switch (currentSection) { ! case TEXT_SECTION : os.writeByte(TEXT_SECTION); break; ! case ATTRIBUTE_SECTION : os.writeByte(ATTRIBUTE_SECTION); break; *************** *** 1255,1316 **** } if (is == null) ! return true; ! ! int docId; ! int len; ! int rawSize; ! long gid; ! long last = -1; ! int freq = 1; ! int sizeHint = -1; ! byte section; ! DocumentImpl doc; ! NodeProxy parent; ! NodeProxy proxy; Match match; try { while (is.available() > 0) { if(context != null) ! context.proceed(); ! docId = is.readInt(); ! section = is.readByte(); ! len = is.readInt(); ! rawSize = is.readFixedInt(); ! if ((doc = docs.getDoc(docId)) == null) { ! is.skipBytes(rawSize); continue; ! } ! if (contextSet != null) ! sizeHint = contextSet.getSizeHint(doc); ! last = 0; ! for (int j = 0; j < len; j++) { ! gid = last + is.readLong(); ! freq = is.readInt(); ! last = gid; ! proxy = (section == TEXT_SECTION ! ? new NodeProxy(doc, gid, ! Node.TEXT_NODE) ! : new NodeProxy(doc, gid, ! Node.ATTRIBUTE_NODE)); if (contextSet != null) { ! if (section == TEXT_SECTION) ! parent = contextSet.parentWithChild(proxy, false, ! true, NodeProxy.UNKNOWN_NODE_LEVEL); ! else ! parent = contextSet.get(proxy); ! if (parent != null) { ! match = new Match(gid, word.toString(), freq); readOccurrences(freq, is, match, word.length()); ! parent.addMatch(match); ! result.add(parent, sizeHint); } else is.skip(freq); } else { ! match = new Match(gid, word.toString(), freq); readOccurrences(freq, is, match, word.length()); ! proxy.addMatch(match); ! result.add(proxy, sizeHint); } } } --- 1254,1328 ---- } if (is == null) ! return true; ! int storedDocId; ! byte storedSection; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; ! int size; ! DocumentImpl storedDocument; ! NodeProxy storedNode; ! NodeProxy parentNode; Match match; + int freq; + int sizeHint; try { while (is.available() > 0) { + if(context != null) ! context.proceed(); ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); ! storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned ! if (storedDocument == null) { ! is.skipBytes(size); continue; ! } ! previousGID = 0; ! for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; ! freq = is.readInt(); ! switch (storedSection) { ! case TEXT_SECTION : ! storedNode = new NodeProxy(storedDocument, storedGID, Node.TEXT_NODE); ! break; ! case ATTRIBUTE_SECTION : ! storedNode = new NodeProxy(storedDocument, storedGID, Node.ATTRIBUTE_NODE); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } if (contextSet != null) { ! switch (storedSection) { ! case TEXT_SECTION : ! parentNode = contextSet.parentWithChild(storedNode, false, true, NodeProxy.UNKNOWN_NODE_LEVEL); ! break; ! case ATTRIBUTE_SECTION : ! 
parentNode = contextSet.get(storedNode); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } ! if (parentNode != null) { ! match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); ! parentNode.addMatch(match); ! sizeHint = contextSet.getSizeHint(storedDocument); ! result.add(parentNode, sizeHint); } else is.skip(freq); } else { ! match = new Match(storedGID, word.toString(), freq); readOccurrences(freq, is, match, word.length()); ! storedNode.addMatch(match); ! result.add(storedNode, -1); } + previousGID = storedGID; } } *************** *** 1318,1327 **** // EOFExceptions are normal } catch (IOException e) { ! LOG.error("io error while reading index", e); //TODO : return early -pb } } if (contextSet != null) ((ExtArrayNodeSet) result).sort(); return true; } --- 1330,1342 ---- // EOFExceptions are normal } catch (IOException e) { ! LOG.error(e.getMessage(), e); //TODO : return early -pb } } + + //TOUNDERSTAND : why sort here ? -pb if (contextSet != null) ((ExtArrayNodeSet) result).sort(); + return true; } *************** *** 1366,1372 **** int docId; byte section; ! int len, rawSize; int freq = 1; ! long gid; DocumentImpl doc; boolean include = true; --- 1381,1393 ---- int docId; byte section; ! ! long storedGID; ! long previousGID; ! long delta; ! ! int len; ! int rawSize; int freq = 1; ! DocumentImpl doc; boolean include = true; *************** *** 1384,1395 **** } docAdded = false; ! gid = 0; for (int j = 0; j < len; j++) { ! gid += is.readLong(); freq = is.readInt(); is.skip(freq); if (contextSet != null) { include = false; ! p = contextSet.parentWithChild(doc, gid, false, true); if (p != null) { if (section == ATTRIBUTE_SECTION) { --- 1405,1417 ---- } docAdded = false; ! previousGID = 0; for (int j = 0; j < len; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; freq = is.readInt(); is.skip(freq); if (contextSet != null) { include = false; ! p = contextSet.parentWithChild(doc, storedGID, false, true); if (p != null) { if (section == ATTRIBUTE_SECTION) { *************** *** 1411,1414 **** --- 1433,1437 ---- oc.addOccurrences(freq); } + previousGID = storedGID; } } Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.55 retrieving revision 1.56 diff -C2 -d -r1.55 -r1.56 *** NativeValueIndex.java 30 Dec 2005 20:21:57 -0000 1.55 --- NativeValueIndex.java 30 Dec 2005 21:45:45 -0000 1.56 *************** *** 811,826 **** gidsCount = is.readInt(); storedDocument = docs.getDoc(storedDocId); ! //TOUNDERSTAND : how could this be possible ? -pb if (storedDocument == null) { is.skip(gidsCount); continue; ! } ! //TOUNDERSTAND : does a null contextSet makes sense ? -pb if (contextSet != null) { if (!contextSet.containsDoc(storedDocument)) { is.skip(gidsCount); continue; ! } ! sizeHint = contextSet.getSizeHint(storedDocument); } //Process the nodes --- 811,825 ---- gidsCount = is.readInt(); storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned if (storedDocument == null) { is.skip(gidsCount); continue; ! } if (contextSet != null) { + //Exit if the document is not concerned if (!contextSet.containsDoc(storedDocument)) { is.skip(gidsCount); continue; ! } } //Process the nodes *************** *** 834,837 **** --- 833,837 ---- // in the context set. 
if (contextSet != null) { + sizeHint = contextSet.getSizeHint(storedDocument); if (returnAncestor) { parentNode = contextSet.parentWithChild(storedNode, false, true, NodeProxy.UNKNOWN_NODE_LEVEL); *************** *** 842,846 **** // otherwise, we add all nodes without check } else { ! result.add(storedNode, sizeHint); } } --- 842,846 ---- // otherwise, we add all nodes without check } else { ! result.add(storedNode, -1); } } *************** *** 928,932 **** storedDocId = is.readInt(); gidsCount = is.readInt(); ! storedDocument = docs.getDoc(storedDocId); if (storedDocument == null) { is.skip(gidsCount); --- 928,933 ---- storedDocId = is.readInt(); gidsCount = is.readInt(); ! storedDocument = docs.getDoc(storedDocId); ! //Exit if the document is not concerned if (storedDocument == null) { is.skip(gidsCount); |
|
From: Pierrick B. <br...@us...> - 2005-12-30 21:03:32
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv7886/src/org/exist/storage Modified Files: NativeTextEngine.java Log Message: Code cleaning, comments... Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.119 retrieving revision 1.120 diff -C2 -d -r1.119 -r1.120 *** NativeTextEngine.java 30 Dec 2005 20:21:57 -0000 1.119 --- NativeTextEngine.java 30 Dec 2005 21:03:20 -0000 1.120 *************** *** 709,712 **** --- 709,713 ---- //s = new String(data, 1, data.length - 1); LOG.error(e.getMessage(), e); + s = null; } break; *************** *** 731,734 **** --- 732,736 ---- // - 1 - Signatures.getLength(idSizeType)); LOG.error(e.getMessage(), e); + val = null; } break; *************** *** 1101,1104 **** --- 1103,1107 ---- if (document.getTreeLevel(storedGID) < document.reindexRequired()) { for (int l = 0; l < freq; l++) { + //Note that we use the existing list storedOccurencesList.add(storedGID, is.readInt()); } *************** *** 1112,1115 **** --- 1115,1119 ---- storedGID)) { for (int l = 0; l < freq; l++) { + //Note that we use the existing list storedOccurencesList.add(storedGID, is.readInt()); } *************** *** 1125,1172 **** } } ! termCount = storedOccurencesList.getTermCount(); ! storedOccurencesList.sort(); ! os.writeInt(document.getDocId()); ! switch (currentSection) { ! case 0 : ! os.writeByte(TEXT_SECTION); ! break; ! case 1 : ! os.writeByte(ATTRIBUTE_SECTION); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } ! os.writeInt(termCount); ! lenOffset = os.position(); ! os.writeFixedInt(0); ! ! previousGID = 0; ! for (int m = 0; m < storedOccurencesList.getSize(); ) { ! delta = storedOccurencesList.nodes[m] - previousGID; ! os.writeLong(delta); ! freq = storedOccurencesList.getOccurrences(m); ! os.writeInt(freq); ! for (int n = 0; n < freq; n++) { ! os.writeInt(storedOccurencesList.offsets[m + n]); } ! previousGID = storedOccurencesList.nodes[m]; ! m += freq; ! } ! os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); ! ! try { ! if (is == null) ! dbTokens.put(ref, os.data()); ! else { ! dbTokens.update(((BFile.PageInputStream) is) ! .getAddress(), ref, os.data()); ! } ! } catch (ReadOnlyException e) { ! //EOF are expected here -pb ! } } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); is = null; } catch (IOException e) { LOG.error("io error while reindexing word '" + token + "'"); --- 1129,1180 ---- } } ! if (storedOccurencesList.getSize() > 0) { ! //append the data from the new list ! termCount = storedOccurencesList.getTermCount(); ! storedOccurencesList.sort(); ! os.writeInt(document.getDocId()); ! switch (currentSection) { ! case 0 : ! os.writeByte(TEXT_SECTION); ! break; ! case 1 : ! os.writeByte(ATTRIBUTE_SECTION); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } ! os.writeInt(termCount); ! lenOffset = os.position(); ! os.writeFixedInt(0); ! ! previousGID = 0; ! for (int m = 0; m < storedOccurencesList.getSize(); ) { ! delta = storedOccurencesList.nodes[m] - previousGID; ! os.writeLong(delta); ! freq = storedOccurencesList.getOccurrences(m); ! os.writeInt(freq); ! for (int n = 0; n < freq; n++) { ! os.writeInt(storedOccurencesList.offsets[m + n]); ! } ! previousGID = storedOccurencesList.nodes[m]; ! m += freq; ! } ! 
os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); ! } ! //Store the data ! if (is == null) { ! if (dbTokens.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not put index data for token '" + token + "'"); } ! }else { ! if (dbTokens.update(((BFile.PageInputStream) is).getAddress(), ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not update index data for value '" + token + "'"); ! } ! } } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); is = null; + } catch (ReadOnlyException e) { + LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); } catch (IOException e) { LOG.error("io error while reindexing word '" + token + "'"); *************** *** 1205,1213 **** String word; try { ! word = new String(key.getData(), 2, key.getLength() - 2, ! "UTF-8"); ! } catch (UnsupportedEncodingException uee) { ! word = new String(key.getData(), 2, key.getLength() - 2); ! } if (matcher.matches(word)) matches.add(word); --- 1213,1222 ---- String word; try { ! word = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); ! } catch (UnsupportedEncodingException e) { ! LOG.error(e.getMessage(), e); ! //word = new String(key.getData(), 2, key.getLength() - 2); ! word = null; ! } if (matcher.matches(word)) matches.add(word); *************** *** 1247,1253 **** if (is == null) return true; ! // int k = 0; int docId; ! int len, rawSize; long gid; long last = -1; --- 1256,1264 ---- if (is == null) return true; ! ! int docId; ! int len; ! int rawSize; long gid; long last = -1; *************** *** 1256,1260 **** byte section; DocumentImpl doc; ! NodeProxy parent, proxy; Match match; try { --- 1267,1272 ---- byte section; DocumentImpl doc; ! NodeProxy parent; ! NodeProxy proxy; Match match; try { *************** *** 1334,1341 **** String term; try { ! term = new String(key.getData(), 2, key.getLength() - 2, ! "UTF-8"); ! } catch (UnsupportedEncodingException uee) { ! term = new String(key.getData(), 2, key.getLength() - 2); } Occurrences oc = (Occurrences) map.get(term); --- 1346,1354 ---- String term; try { ! term = new String(key.getData(), 2, key.getLength() - 2, "UTF-8"); ! } catch (UnsupportedEncodingException e) { ! LOG.error(e.getMessage(), e); ! //term = new String(key.getData(), 2, key.getLength() - 2); ! term = null; } Occurrences oc = (Occurrences) map.get(term); |
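A detail of this hunk: dbTokens.put() and dbTokens.update() signal failure through a sentinel address rather than an exception, so the reworked code compares the return value against BFile.UNKNOWN_ADDRESS and logs when the write does not succeed. A self-contained sketch of that store-or-update pattern; the Store interface, the sentinel value, and storeOrUpdate() are invented stand-ins, not the BFile API:

// Sketch of sentinel-checked put/update, loosely modelled on the diff above.
final class SentinelStore {
    static final long UNKNOWN_ADDRESS = -1L;

    interface Store {
        long put(String key, byte[] data);
        long update(long address, String key, byte[] data);
    }

    static void storeOrUpdate(Store store, String key, byte[] data, Long existingAddress) {
        long result;
        if (existingAddress == null) {
            result = store.put(key, data);                       // first write for this key
        } else {
            result = store.update(existingAddress, key, data);   // overwrite in place
        }
        if (result == UNKNOWN_ADDRESS) {
            // Mirror the diff: log and continue rather than aborting the flush.
            System.err.println("Could not store index data for '" + key + "'");
        }
    }
}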
|
From: Pierrick B. <br...@us...> - 2005-12-30 20:22:06
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv1322/src/org/exist/storage Modified Files: NativeTextEngine.java NativeElementIndex.java NativeValueIndex.java Log Message: Refactored how IOExceptions are handled in indexing code. Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.71 retrieving revision 1.72 diff -C2 -d -r1.71 -r1.72 *** NativeElementIndex.java 30 Dec 2005 19:29:41 -0000 1.71 --- NativeElementIndex.java 30 Dec 2005 20:21:57 -0000 1.72 *************** *** 178,182 **** //Store the data if (dbNodes.append(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not put index data for node '" + qname + "'"); } } catch (LockException e) { --- 178,182 ---- //Store the data if (dbNodes.append(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not put index data for node '" + qname + "'"); } } catch (LockException e) { *************** *** 283,290 **** } } catch (EOFException e) { LOG.warn(e.getMessage(), e); - } catch (IOException e) { - LOG.error(e.getMessage(), e); - //TODO : data will be saved although os is probably corrupted ! -pb } //append the data from the new list --- 283,288 ---- } } catch (EOFException e) { + //TODO : remove this block if unexpected -pb LOG.warn(e.getMessage(), e); } //append the data from the new list *************** *** 321,325 **** LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbNodes.getFile().getName() + "'", e); } finally { lock.release(); --- 319,325 ---- LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbNodes.getFile().getName() + "'", e); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); } finally { lock.release(); *************** *** 501,507 **** } catch (EOFException e) { //EOFExceptions expected there - } catch (IOException e) { - LOG.error(e.getMessage(), e); - //TODO : data will be saved although os is probably corrupted ! -pb } } --- 501,504 ---- *************** *** 720,727 **** } } catch (EOFException e) { ! LOG.warn(e.getMessage(), e); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! //TODO : return null ? -pb } } --- 717,722 ---- } } catch (EOFException e) { ! //TODO : remove this block if unexpected -pb ! LOG.warn(e.getMessage(), e); } } *************** *** 809,816 **** } } catch (EOFException e) { LOG.warn(e.getMessage(), e); - } catch (IOException e) { - LOG.error(e.getMessage(), e); - //TODO : throw an exception ? -pb } LOG.debug(msg.toString()); --- 804,809 ---- } } catch (EOFException e) { + //TODO : remove this block if unexpected -pb LOG.warn(e.getMessage(), e); } LOG.debug(msg.toString()); Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.118 retrieving revision 1.119 diff -C2 -d -r1.118 -r1.119 *** NativeTextEngine.java 30 Dec 2005 19:54:02 -0000 1.118 --- NativeTextEngine.java 30 Dec 2005 20:21:57 -0000 1.119 *************** *** 980,987 **** } catch (EOFException e) { //Is it expected ? -pb ! LOG.warn(e.getMessage(), e); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! 
//TODO : data will be saved although os is probably corrupted ! -pb } //append the data from the new list --- 980,984 ---- } catch (EOFException e) { //Is it expected ? -pb ! LOG.warn(e.getMessage(), e); } //append the data from the new list *************** *** 1036,1040 **** LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); } finally { lock.release(); --- 1033,1039 ---- LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); } finally { lock.release(); *************** *** 1123,1131 **** } } catch (EOFException e) { ! //LOG.error("end-of-file while reading index entry ! // for " + word, e); ! } catch (IOException e) { ! LOG.error("io-error while reading index entry for " ! + token, e); } } --- 1122,1126 ---- } } catch (EOFException e) { ! //EOF is expected here } } *************** *** 1169,1173 **** } } catch (ReadOnlyException e) { ! } } catch (LockException e) { --- 1164,1168 ---- } } catch (ReadOnlyException e) { ! //EOF are expected here -pb } } catch (LockException e) { *************** *** 1246,1251 **** try { is = dbTokens.getAsStream(pointer); ! } catch (IOException ioe) { ! LOG.warn(ioe.getMessage(), ioe); } if (is == null) --- 1241,1247 ---- try { is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.warn(e.getMessage(), e); ! //TODO : return early -pb } if (is == null) *************** *** 1310,1314 **** // EOFExceptions are normal } catch (IOException e) { ! LOG.warn("io error while reading index", e); } } --- 1306,1311 ---- // EOFExceptions are normal } catch (IOException e) { ! LOG.error("io error while reading index", e); ! //TODO : return early -pb } } *************** *** 1347,1352 **** try { is = dbTokens.getAsStream(pointer); ! } catch (IOException ioe) { ! LOG.warn(ioe.getMessage(), ioe); } if (is == null) --- 1344,1350 ---- try { is = dbTokens.getAsStream(pointer); ! } catch (IOException e) { ! LOG.warn(e.getMessage(), e); ! //TODO : return early -pb } if (is == null) Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.54 retrieving revision 1.55 diff -C2 -d -r1.54 -r1.55 *** NativeValueIndex.java 30 Dec 2005 19:54:02 -0000 1.54 --- NativeValueIndex.java 30 Dec 2005 20:21:57 -0000 1.55 *************** *** 321,329 **** } } catch (EOFException e) { ! //Is it expected ? -pb LOG.warn(e.getMessage(), e); - } catch (IOException e) { - LOG.error(e.getMessage(), e); - //TODO : data will be saved although os is probably corrupted ! -pb } //append the data from the new list --- 321,326 ---- } } catch (EOFException e) { ! //Is it expected ? if not, remove the block -pb LOG.warn(e.getMessage(), e); } //append the data from the new list *************** *** 357,361 **** //TODO : return ? } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbValues.getFile().getName() + "'", e); } finally { lock.release(); --- 354,360 ---- //TODO : return ? } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbValues.getFile().getName() + "'", e); ! } catch (IOException e) { ! 
LOG.error(e.getMessage(), e); } finally { lock.release(); *************** *** 585,599 **** Value keyPrefix = computeKeyPrefix(value.getType(), collectionId); try { ! lock.acquire(); ! try { ! dbValues.query(query, keyPrefix, callback); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! } catch (BTreeException e) { ! LOG.warn(e.getMessage(), e); ! } } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbValues.getFile().getName() + "'", e); ! } finally { lock.release(); } --- 584,596 ---- Value keyPrefix = computeKeyPrefix(value.getType(), collectionId); try { ! lock.acquire(); ! dbValues.query(query, keyPrefix, callback); } catch (LockException e) { LOG.warn("Failed to acquire lock for '" + dbValues.getFile().getName() + "'", e); ! } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! } catch (BTreeException e) { ! LOG.error(e.getMessage(), e); ! } finally { lock.release(); } *************** *** 796,800 **** is = dbValues.getAsStream(pointer); } catch (IOException e) { ! LOG.warn(e.getMessage(), e); } if (is == null) --- 793,798 ---- is = dbValues.getAsStream(pointer); } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! //TODO : return early -pb } if (is == null) *************** *** 848,853 **** } } ! } catch (EOFException e) { ! // EOF is expected here } catch (IOException e) { LOG.error(e.getMessage(), e); --- 846,851 ---- } } ! } catch (EOFException e) { ! // EOF is expected here } catch (IOException e) { LOG.error(e.getMessage(), e); *************** *** 913,917 **** is = dbValues.getAsStream(pointer); } catch (IOException e) { ! LOG.warn(e.getMessage(), e); } if (is == null) --- 911,916 ---- is = dbValues.getAsStream(pointer); } catch (IOException e) { ! LOG.error(e.getMessage(), e); ! //TODO : return early -pb } if (is == null) |
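The refactoring in this commit replaces the earlier "leave is null, test it later" style with logging followed by an early return from the BTree callback, which keeps the stream-decoding loop free of deferred null checks. A minimal sketch of that shape; readStream() and processEntries() are hypothetical stand-ins for dbValues.getAsStream() and the decode loop:

import java.io.IOException;
import java.io.InputStream;

// Sketch of the "log and return early" callback style adopted in this commit.
final class IndexCallback {
    // Hypothetical stand-in for the stream lookup; may fail with IOException.
    static InputStream readStream(long pointer) throws IOException {
        return new java.io.ByteArrayInputStream(new byte[0]);
    }

    static boolean indexInfo(long pointer) {
        InputStream is;
        try {
            is = readStream(pointer);
        } catch (IOException e) {
            // Log once and return early; returning true means "keep scanning".
            System.err.println(e.getMessage());
            return true;
        }
        // From here on 'is' is known to be usable: no deferred null check needed.
        processEntries(is);
        return true;
    }

    static void processEntries(InputStream is) {
        // decoding of document ids / node ids would go here
    }
}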
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-30 20:15:58
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/util In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32492/src/org/exist/util Modified Files: FastQSort.java HeapSort.java Log Message: Removed warnings about character encodings in comments. Index: FastQSort.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/util/FastQSort.java,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** FastQSort.java 17 Dec 2005 15:01:16 -0000 1.7 --- FastQSort.java 30 Dec 2005 20:15:47 -0000 1.8 *************** *** 50,54 **** http://www.michael-maniscalco.com/sorting.htm ! @author José María Fernández */ public final class FastQSort { --- 50,54 ---- http://www.michael-maniscalco.com/sorting.htm ! @author Jose Maria Fernandez */ public final class FastQSort { Index: HeapSort.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/util/HeapSort.java,v retrieving revision 1.1 retrieving revision 1.2 diff -C2 -d -r1.1 -r1.2 *** HeapSort.java 11 Dec 2005 19:26:24 -0000 1.1 --- HeapSort.java 30 Dec 2005 20:15:47 -0000 1.2 *************** *** 38,42 **** framework by Cay Horstmann. ! @author José María Fernández */ public final class HeapSort { --- 38,42 ---- framework by Cay Horstmann. ! @author Jose Maria Fernandez */ public final class HeapSort { |
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-30 20:14:33
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/dom In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32292/src/org/exist/dom Modified Files: DocumentSet.java Log Message: DocumentSet.equals() should test for object identity. If the two objects are identical, we don't need to compare them. The operation is expensive. Index: DocumentSet.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/dom/DocumentSet.java,v retrieving revision 1.22 retrieving revision 1.23 diff -C2 -d -r1.22 -r1.23 *** DocumentSet.java 16 Dec 2005 21:28:50 -0000 1.22 --- DocumentSet.java 30 Dec 2005 20:14:24 -0000 1.23 *************** *** 28,32 **** import java.util.TreeSet; - import org.apache.log4j.Logger; import org.exist.collections.Collection; import org.exist.security.Permission; --- 28,31 ---- *************** *** 51,56 **** public final static DocumentSet EMPTY_DOCUMENT_SET = new DocumentSet(9); - - private final static Logger LOG = Logger.getLogger(DocumentSet.class.getName()); private ArrayList list = null; --- 50,53 ---- *************** *** 246,249 **** --- 243,249 ---- public boolean equals(Object other) { + if (this == other) + // we are comparing the same objects + return true; final DocumentSet o = (DocumentSet) other; if (size() != o.size()) |
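The fast path added above can be shown in isolation: an equals() that short-circuits on object identity before falling back to the expensive element-by-element comparison. IdSet below is a hypothetical container used only for illustration, not the eXist DocumentSet.

    import java.util.Arrays;

    // Minimal sketch of the identity fast path; IdSet is a hypothetical container.
    public final class IdSet {

        private final int[] ids;

        public IdSet(int... ids) {
            this.ids = ids.clone();
        }

        @Override
        public boolean equals(Object other) {
            if (this == other)
                return true;                      // same object: skip the expensive comparison
            if (!(other instanceof IdSet))
                return false;
            IdSet o = (IdSet) other;
            if (ids.length != o.ids.length)
                return false;
            return Arrays.equals(ids, o.ids);     // element check only when really needed
        }

        @Override
        public int hashCode() {
            return Arrays.hashCode(ids);
        }

        public static void main(String[] args) {
            IdSet a = new IdSet(1, 2, 3);
            System.out.println(a.equals(a));                  // true via the identity check
            System.out.println(a.equals(new IdSet(1, 2, 3))); // true via the full comparison
        }
    }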
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-30 20:10:46
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/update In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv31601/src/org/exist/xquery/update Modified Files: Modification.java Log Message: * Improve speed of xpath steps nested within an outer for loop, e.g.: for $a in //ab let $b := $a/de return $b/@fg. In this case we know that we have to compute $a/de and $b/@fg for all instantiations of $a, so we could preload the corresponding node sets once, cache them and reuse them for all iterations. * Fixed: some cached fields were not properly cleared in the compiled expression tree. Index: Modification.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/update/Modification.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** Modification.java 18 Dec 2005 11:43:49 -0000 1.6 --- Modification.java 30 Dec 2005 20:10:37 -0000 1.7 *************** *** 93,96 **** --- 93,97 ---- */ public void resetState() { + super.resetState(); select.resetState(); if (value != null) |
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-30 20:08:02
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv31086/src/org/exist/xquery/functions Modified Files: FunDocAvailable.java FunDoc.java ExtFulltext.java Log Message: * Improve speed of xpath steps nested within an outer for loop, e.g.: for $a in //ab let $b := $a/de return $b/@fg. In this case we know that we have to compute $a/de and $b/@fg for all instantiations of $a, so we could preload the corresponding node sets once, cache them and reuse them for all iterations. * Fixed: some cached fields were not properly cleared in the compiled expression tree. Index: ExtFulltext.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/ExtFulltext.java,v retrieving revision 1.18 retrieving revision 1.19 diff -C2 -d -r1.18 -r1.19 *** ExtFulltext.java 6 Dec 2005 17:50:29 -0000 1.18 --- ExtFulltext.java 30 Dec 2005 20:07:53 -0000 1.19 *************** *** 262,269 **** --- 262,275 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + path.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.PathExpr#resetState() */ public void resetState() { + super.resetState(); path.resetState(); searchTerm.resetState(); Index: FunDoc.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunDoc.java,v retrieving revision 1.22 retrieving revision 1.23 diff -C2 -d -r1.22 -r1.23 *** FunDoc.java 12 Dec 2005 20:32:48 -0000 1.22 --- FunDoc.java 30 Dec 2005 20:07:53 -0000 1.23 *************** *** 115,118 **** --- 115,119 ---- */ public void resetState() { + super.resetState(); getArgument(0).resetState(); } Index: FunDocAvailable.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunDocAvailable.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** FunDocAvailable.java 12 Dec 2005 20:32:48 -0000 1.6 --- FunDocAvailable.java 30 Dec 2005 20:07:53 -0000 1.7 *************** *** 108,111 **** --- 108,112 ---- */ public void resetState() { + super.resetState(); getArgument(0).resetState(); } |
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv30832/src/org/exist/xquery Modified Files: AbstractExpression.java DynamicCardinalityCheck.java DynamicNameCheck.java LocationStep.java UserDefinedFunction.java ConditionalExpression.java Step.java AttributeConstructor.java CombiningExpression.java UntypedValueCheck.java GeneralComparison.java LetExpr.java AtomicToString.java SimpleStep.java FilteredExpression.java Predicate.java Expression.java CastableExpression.java PathExpr.java DynamicTypeCheck.java SequenceConstructor.java Atomize.java CastExpression.java InstanceOfExpression.java NodeConstructor.java BindingExpression.java VariableDeclaration.java BinaryOp.java FunctionCall.java ElementConstructor.java Log Message: * Improve speed of xpath steps nested within an outer for loop, e.g.: for $a in //ab let $b := $a/de return $b/@fg. In this case we know that we have to compute $a/de and $b/@fg for all instantiations of $a, so we could preload the corresponding node sets once, cache them and reuse them for all iterations. * Fixed: some cached fields were not properly cleared in the compiled expression tree. Index: BindingExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/BindingExpression.java,v retrieving revision 1.15 retrieving revision 1.16 diff -C2 -d -r1.15 -r1.16 *** BindingExpression.java 11 Dec 2005 10:19:12 -0000 1.15 --- BindingExpression.java 30 Dec 2005 20:07:18 -0000 1.16 *************** *** 210,213 **** --- 210,214 ---- */ public void resetState() { + super.resetState(); inputSequence.resetState(); if(whereExpr != null) whereExpr.resetState(); Index: DynamicNameCheck.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/DynamicNameCheck.java,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** DynamicNameCheck.java 11 Dec 2005 10:19:12 -0000 1.5 --- DynamicNameCheck.java 30 Dec 2005 20:07:18 -0000 1.6 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.util.ExpressionDumper; import org.exist.xquery.value.Item; *************** *** 93,96 **** --- 94,98 ---- */ public void resetState() { + super.resetState(); expression.resetState(); } *************** *** 129,131 **** --- 131,137 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } } Index: Expression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/Expression.java,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** Expression.java 1 May 2005 11:13:09 -0000 1.9 --- Expression.java 30 Dec 2005 20:07:18 -0000 1.10 *************** *** 167,170 **** --- 167,172 ---- public void setContextDocSet(DocumentSet contextSet); + public DocumentSet getContextDocSet(); + /** * Returns the {@link XQueryAST} node from which this expression Index: CastExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/CastExpression.java,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** CastExpression.java 11 Dec 2005 10:19:12 -0000 1.9 --- CastExpression.java 30 Dec 2005 20:07:18 -0000 1.10 *************** *** 23,26 **** 
--- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.util.ExpressionDumper; import org.exist.xquery.value.AtomicValue; *************** *** 148,155 **** --- 149,162 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#resetState() */ public void resetState() { + super.resetState(); expression.resetState(); } Index: DynamicTypeCheck.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/DynamicTypeCheck.java,v retrieving revision 1.9 retrieving revision 1.10 diff -C2 -d -r1.9 -r1.10 *** DynamicTypeCheck.java 7 Dec 2005 22:56:38 -0000 1.9 --- DynamicTypeCheck.java 30 Dec 2005 20:07:18 -0000 1.10 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.parser.XQueryAST; import org.exist.xquery.util.ExpressionDumper; *************** *** 117,123 **** --- 118,130 ---- */ public void resetState() { + super.resetState(); expression.resetState(); } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#getASTNode() Index: GeneralComparison.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/GeneralComparison.java,v retrieving revision 1.51 retrieving revision 1.52 diff -C2 -d -r1.51 -r1.52 *** GeneralComparison.java 22 Dec 2005 15:04:12 -0000 1.51 --- GeneralComparison.java 30 Dec 2005 20:07:18 -0000 1.52 *************** *** 741,744 **** --- 741,745 ---- */ public void resetState() { + super.resetState(); getLeft().resetState(); getRight().resetState(); Index: CastableExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/CastableExpression.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** CastableExpression.java 20 Dec 2005 20:38:29 -0000 1.6 --- CastableExpression.java 30 Dec 2005 20:07:18 -0000 1.7 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.util.ExpressionDumper; import org.exist.xquery.value.BooleanValue; *************** *** 134,138 **** --- 135,145 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + public void resetState() { + super.resetState(); expression.resetState(); } Index: Atomize.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/Atomize.java,v retrieving revision 1.11 retrieving revision 1.12 diff -C2 -d -r1.11 -r1.12 *** Atomize.java 11 Dec 2005 10:19:12 -0000 1.11 --- Atomize.java 30 Dec 2005 20:07:18 -0000 1.12 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.parser.XQueryAST; import org.exist.xquery.util.ExpressionDumper; *************** *** 114,121 **** --- 115,128 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see 
org.exist.xquery.AbstractExpression#resetState() */ public void resetState() { + super.resetState(); expression.resetState(); } Index: AtomicToString.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/AtomicToString.java,v retrieving revision 1.4 retrieving revision 1.5 diff -C2 -d -r1.4 -r1.5 *** AtomicToString.java 29 Oct 2005 10:51:12 -0000 1.4 --- AtomicToString.java 30 Dec 2005 20:07:18 -0000 1.5 *************** *** 104,111 **** --- 104,117 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#resetState() */ public void resetState() { + super.resetState(); expression.resetState(); } Index: LocationStep.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/LocationStep.java,v retrieving revision 1.65 retrieving revision 1.66 diff -C2 -d -r1.65 -r1.66 *** LocationStep.java 20 Dec 2005 22:37:06 -0000 1.65 --- LocationStep.java 30 Dec 2005 20:07:18 -0000 1.66 *************** *** 51,54 **** --- 51,56 ---- public class LocationStep extends Step { + private final int ATTR_DIRECT_SELECT_THRESHOLD = 3; + protected NodeSet currentSet = null; protected DocumentSet currentDocs = null; *************** *** 124,127 **** --- 126,130 ---- //TODO : log and/or profile ? pred = (Predicate) i.next(); + pred.setContextDocSet(getContextDocSet()); result = pred.evalPredicate(outerSequence, result, axis); } *************** *** 311,320 **** // if there's just a single known node in the context, it is faster // do directly search for the attribute in the parent node. ! } else if(axis == Constants.ATTRIBUTE_AXIS && contextSet.getLength() == 1 && !(contextSet instanceof VirtualNodeSet)) { NodeProxy proxy = contextSet.get(0); ! if (proxy.getInternalAddress() != NodeProxy.UNKNOWN_NODE_ADDRESS) return contextSet.directSelectAttribute(test.getName(), inPredicate); ! } if (preloadNodeSets()) { DocumentSet docs = getDocumentSet(contextSet); --- 314,324 ---- // if there's just a single known node in the context, it is faster // do directly search for the attribute in the parent node. ! } else if( ! axis == Constants.ATTRIBUTE_AXIS && contextSet.getLength() < ATTR_DIRECT_SELECT_THRESHOLD && !(contextSet instanceof VirtualNodeSet)) { NodeProxy proxy = contextSet.get(0); ! if (proxy != null && proxy.getInternalAddress() != NodeProxy.UNKNOWN_NODE_ADDRESS) return contextSet.directSelectAttribute(test.getName(), inPredicate); ! } if (preloadNodeSets()) { DocumentSet docs = getDocumentSet(contextSet); *************** *** 326,330 **** //TODO : why a null selector here ? We have one below ! currentSet = index.findElementsByTagName(ElementValue.ATTRIBUTE, docs, test.getName(), null); ! currentDocs = docs; registerUpdateListener(); } --- 330,334 ---- //TODO : why a null selector here ? We have one below ! currentSet = index.findElementsByTagName(ElementValue.ATTRIBUTE, docs, test.getName(), null); ! currentDocs = docs; registerUpdateListener(); } *************** *** 355,360 **** context.getProfiler().message(this, Profiler.OPTIMIZATIONS, "OPTIMIZATION", "using index '" + index.toString() + "'"); ! return index.getAttributesByName(docs, test.getName(), selector); ! } } --- 359,364 ---- context.getProfiler().message(this, Profiler.OPTIMIZATIONS, "OPTIMIZATION", "using index '" + index.toString() + "'"); ! 
return index.getAttributesByName(docs, test.getName(), selector); ! } } *************** *** 365,369 **** vset.setInPredicate(inPredicate); return vset; ! } else if (preloadNodeSets()) { DocumentSet docs = getDocumentSet(contextSet); //TODO : understand why this one is different from the other ones --- 369,373 ---- vset.setInPredicate(inPredicate); return vset; ! } else if (preloadNodeSets()) { DocumentSet docs = getDocumentSet(contextSet); //TODO : understand why this one is different from the other ones Index: DynamicCardinalityCheck.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/DynamicCardinalityCheck.java,v retrieving revision 1.13 retrieving revision 1.14 diff -C2 -d -r1.13 -r1.14 *** DynamicCardinalityCheck.java 11 Dec 2005 10:19:12 -0000 1.13 --- DynamicCardinalityCheck.java 30 Dec 2005 20:07:18 -0000 1.14 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.util.Error; import org.exist.xquery.util.ExpressionDumper; *************** *** 130,137 **** --- 131,144 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#resetState() */ public void resetState() { + super.resetState(); expression.resetState(); } Index: Predicate.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/Predicate.java,v retrieving revision 1.32 retrieving revision 1.33 diff -C2 -d -r1.32 -r1.33 *** Predicate.java 29 Dec 2005 15:05:15 -0000 1.32 --- Predicate.java 30 Dec 2005 20:07:18 -0000 1.33 *************** *** 27,30 **** --- 27,31 ---- import org.exist.dom.ContextItem; import org.exist.dom.DocumentImpl; + import org.exist.dom.DocumentSet; import org.exist.dom.ExtArrayNodeSet; import org.exist.dom.NodeProxy; *************** *** 338,341 **** --- 339,348 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + if (getLength() > 0) + getExpression(0).setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.PathExpr#resetState() Index: AbstractExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/AbstractExpression.java,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** AbstractExpression.java 16 Nov 2004 15:47:05 -0000 1.7 --- AbstractExpression.java 30 Dec 2005 20:07:18 -0000 1.8 *************** *** 60,64 **** * @see org.exist.xquery.Expression#resetState() */ ! public abstract void resetState(); /** --- 60,66 ---- * @see org.exist.xquery.Expression#resetState() */ ! public void resetState() { ! contextDocSet = null; ! 
} /** Index: VariableDeclaration.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/VariableDeclaration.java,v retrieving revision 1.11 retrieving revision 1.12 diff -C2 -d -r1.11 -r1.12 *** VariableDeclaration.java 7 Dec 2005 21:20:46 -0000 1.11 --- VariableDeclaration.java 30 Dec 2005 20:07:18 -0000 1.12 *************** *** 172,175 **** --- 172,176 ---- */ public void resetState() { + super.resetState(); expression.resetState(); } Index: ConditionalExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/ConditionalExpression.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** ConditionalExpression.java 29 Oct 2005 10:51:11 -0000 1.6 --- ConditionalExpression.java 30 Dec 2005 20:07:18 -0000 1.7 *************** *** 142,145 **** --- 142,146 ---- */ public void resetState() { + super.resetState(); testExpr.resetState(); thenExpr.resetState(); Index: PathExpr.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/PathExpr.java,v retrieving revision 1.27 retrieving revision 1.28 diff -C2 -d -r1.27 -r1.28 *** PathExpr.java 27 Dec 2005 15:32:37 -0000 1.27 --- PathExpr.java 30 Dec 2005 20:07:18 -0000 1.28 *************** *** 144,147 **** --- 144,150 ---- } + if (contextDocs != null) + setContextDocSet(contextDocs); + Item current; Sequence values; *************** *** 324,327 **** --- 327,331 ---- */ public void resetState() { + super.resetState(); for (Iterator i = steps.iterator(); i.hasNext();) { ((Expression)i.next()).resetState(); Index: InstanceOfExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/InstanceOfExpression.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** InstanceOfExpression.java 11 Dec 2005 10:19:12 -0000 1.6 --- InstanceOfExpression.java 30 Dec 2005 20:07:18 -0000 1.7 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.util.ExpressionDumper; import org.exist.xquery.value.BooleanValue; *************** *** 140,145 **** --- 141,151 ---- */ public void resetState() { + super.resetState(); expression.resetState(); } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } } Index: AttributeConstructor.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/AttributeConstructor.java,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** AttributeConstructor.java 21 Dec 2005 17:07:34 -0000 1.8 --- AttributeConstructor.java 30 Dec 2005 20:07:18 -0000 1.9 *************** *** 171,174 **** --- 171,175 ---- */ public void resetState() { + super.resetState(); Object object; for(Iterator i = contents.iterator(); i.hasNext(); ) { Index: LetExpr.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/LetExpr.java,v retrieving revision 1.20 retrieving revision 1.21 diff -C2 -d -r1.20 -r1.21 *** LetExpr.java 11 Dec 2005 10:19:12 -0000 1.20 --- LetExpr.java 30 Dec 2005 20:07:18 -0000 1.21 *************** *** 103,106 **** --- 103,107 ---- context.declareVariableBinding(var); 
var.setValue(in); + var.setContextDocs(inputSequence.getContextDocSet()); var.checkType(); Index: UntypedValueCheck.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/UntypedValueCheck.java,v retrieving revision 1.12 retrieving revision 1.13 diff -C2 -d -r1.12 -r1.13 *** UntypedValueCheck.java 11 Dec 2005 10:19:12 -0000 1.12 --- UntypedValueCheck.java 30 Dec 2005 20:07:18 -0000 1.13 *************** *** 145,151 **** --- 145,157 ---- */ public void resetState() { + super.resetState(); expression.resetState(); } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#getASTNode() Index: CombiningExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/CombiningExpression.java,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** CombiningExpression.java 16 Nov 2004 15:47:05 -0000 1.5 --- CombiningExpression.java 30 Dec 2005 20:07:18 -0000 1.6 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.value.Item; import org.exist.xquery.value.Sequence; *************** *** 68,75 **** --- 69,83 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + left.setContextDocSet(contextSet); + right.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.Expression#resetState() */ public void resetState() { + super.resetState(); left.resetState(); right.resetState(); Index: UserDefinedFunction.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/UserDefinedFunction.java,v retrieving revision 1.11 retrieving revision 1.12 diff -C2 -d -r1.11 -r1.12 *** UserDefinedFunction.java 26 Nov 2005 17:19:30 -0000 1.11 --- UserDefinedFunction.java 30 Dec 2005 20:07:18 -0000 1.12 *************** *** 162,165 **** --- 162,166 ---- public void resetState() { //TODO ; understand this test. Why not reset even is not in recursion ? + // Answer: would lead to an infinite loop if the function is recursive. 
if(!inRecursion) { inRecursion = true; Index: SimpleStep.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/SimpleStep.java,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** SimpleStep.java 11 Dec 2005 10:19:12 -0000 1.8 --- SimpleStep.java 30 Dec 2005 20:07:18 -0000 1.9 *************** *** 23,26 **** --- 23,27 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.dom.NodeSet; import org.exist.xquery.value.Item; *************** *** 105,108 **** --- 106,114 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + expression.setContextDocSet(contextSet); + } + /* (non-Javadoc) * @see org.exist.xquery.AbstractExpression#setPrimaryAxis(int) Index: Step.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/Step.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** Step.java 26 Nov 2005 15:10:21 -0000 1.6 --- Step.java 30 Dec 2005 20:07:18 -0000 1.7 *************** *** 131,134 **** --- 131,135 ---- */ public void resetState() { + super.resetState(); for (Iterator i = predicates.iterator(); i.hasNext();) { Predicate pred = (Predicate) i.next(); Index: BinaryOp.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/BinaryOp.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** BinaryOp.java 5 Jan 2005 07:43:20 -0000 1.6 --- BinaryOp.java 30 Dec 2005 20:07:18 -0000 1.7 *************** *** 20,23 **** --- 20,24 ---- package org.exist.xquery; + import org.exist.dom.DocumentSet; import org.exist.xquery.value.Item; import org.exist.xquery.value.Sequence; *************** *** 52,55 **** --- 53,62 ---- } + public void setContextDocSet(DocumentSet contextSet) { + super.setContextDocSet(contextSet); + getLeft().setContextDocSet(contextSet); + getRight().setContextDocSet(contextSet); + } + /* * (non-Javadoc) Index: NodeConstructor.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/NodeConstructor.java,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** NodeConstructor.java 14 Nov 2004 22:15:11 -0000 1.3 --- NodeConstructor.java 30 Dec 2005 20:07:18 -0000 1.4 *************** *** 70,73 **** --- 70,74 ---- */ public void resetState() { + super.resetState(); } } Index: FilteredExpression.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/FilteredExpression.java,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** FilteredExpression.java 11 Dec 2005 10:19:12 -0000 1.7 --- FilteredExpression.java 30 Dec 2005 20:07:18 -0000 1.8 *************** *** 136,139 **** --- 136,140 ---- */ public void resetState() { + super.resetState(); expression.resetState(); for (Iterator i = predicates.iterator(); i.hasNext();) { Index: FunctionCall.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/FunctionCall.java,v retrieving revision 1.19 retrieving revision 1.20 diff -C2 -d -r1.19 -r1.20 *** FunctionCall.java 6 Dec 2005 17:50:29 -0000 1.19 --- FunctionCall.java 30 Dec 2005 20:07:18 -0000 1.20 *************** *** 180,183 **** --- 180,184 ---- */ public void resetState() 
{ + super.resetState(); functionDef.resetState(); //TODO : reset expression ? Index: SequenceConstructor.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/SequenceConstructor.java,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** SequenceConstructor.java 4 Nov 2005 17:27:24 -0000 1.8 --- SequenceConstructor.java 30 Dec 2005 20:07:18 -0000 1.9 *************** *** 108,111 **** --- 108,112 ---- */ public void resetState() { + super.resetState(); for (Iterator i = steps.iterator(); i.hasNext();) { ((Expression) i.next()).resetState(); Index: ElementConstructor.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/ElementConstructor.java,v retrieving revision 1.20 retrieving revision 1.21 diff -C2 -d -r1.20 -r1.21 *** ElementConstructor.java 28 Dec 2005 18:10:14 -0000 1.20 --- ElementConstructor.java 30 Dec 2005 20:07:18 -0000 1.21 *************** *** 29,32 **** --- 29,33 ---- import org.exist.memtree.MemTreeBuilder; import org.exist.memtree.NodeImpl; + import org.exist.util.sanity.SanityCheck; import org.exist.xquery.util.ExpressionDumper; import org.exist.xquery.value.Item; *************** *** 253,256 **** --- 254,258 ---- */ public void resetState() { + super.resetState(); if(content != null) content.resetState(); |
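Two patterns recur through this commit: setContextDocSet() is overridden to push the cached document set down into wrapped sub-expressions, and every resetState() now calls super.resetState() so the cached set is actually cleared between executions. The following is a compact sketch of both patterns under hypothetical names (Expr, LeafExpr, WrappingExpr), not the real eXist expression hierarchy.

    import java.util.Collections;
    import java.util.Set;

    // Sketch: propagate the cached context document set downward, and always
    // call super.resetState() so the cache does not survive a reset.
    public class ContextDocSetSketch {

        static abstract class Expr {
            protected Set<String> contextDocSet;

            void setContextDocSet(Set<String> contextSet) {
                this.contextDocSet = contextSet;
            }

            Set<String> getContextDocSet() {
                return contextDocSet;
            }

            void resetState() {
                contextDocSet = null;          // cached field cleared, as in AbstractExpression
            }
        }

        static class LeafExpr extends Expr {
        }

        static class WrappingExpr extends Expr {
            final Expr inner;

            WrappingExpr(Expr inner) {
                this.inner = inner;
            }

            @Override
            void setContextDocSet(Set<String> contextSet) {
                super.setContextDocSet(contextSet);
                inner.setContextDocSet(contextSet);   // propagate to the wrapped expression
            }

            @Override
            void resetState() {
                super.resetState();                   // without this call the cache would survive
                inner.resetState();
            }
        }

        public static void main(String[] args) {
            LeafExpr leaf = new LeafExpr();
            WrappingExpr outer = new WrappingExpr(leaf);
            outer.setContextDocSet(Collections.singleton("doc1.xml"));
            System.out.println(leaf.getContextDocSet());   // [doc1.xml]
            outer.resetState();
            System.out.println(leaf.getContextDocSet());   // null
        }
    }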
|
From: Pierrick B. <br...@us...> - 2005-12-30 19:54:11
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv28803/src/org/exist/storage Modified Files: NativeTextEngine.java NativeValueIndex.java Log Message: Code cleaning, comments... Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.117 retrieving revision 1.118 diff -C2 -d -r1.117 -r1.118 *** NativeTextEngine.java 30 Dec 2005 19:29:41 -0000 1.117 --- NativeTextEngine.java 30 Dec 2005 19:54:02 -0000 1.118 *************** *** 950,955 **** size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != this.doc.getDocId()) { ! // data are related to another section or document: ! // append them to any existing data os.writeInt(storedDocId); os.writeByte(storedSection); --- 950,955 ---- size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != this.doc.getDocId()) { ! // data are related to another section or document: ! // append them to any existing data os.writeInt(storedDocId); os.writeByte(storedSection); *************** *** 958,962 **** is.copyRaw(os, size); } else { ! // data are related to our section and document: // feed the new list with the GIDs previousGID = 0; --- 958,962 ---- is.copyRaw(os, size); } else { ! // data are related to our section and document: // feed the new list with the GIDs previousGID = 0; *************** *** 1065,1068 **** --- 1065,1069 ---- for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { + //Compute a key for the token entry = (Map.Entry) i.next(); token = (String) entry.getKey(); *************** *** 1072,1078 **** try { lock.acquire(Lock.WRITE_LOCK); ! is = dbTokens.getAsStream(ref); if (is != null) { ! // add old entries to the new list try { while (is.available() > 0) { --- 1073,1080 ---- try { lock.acquire(Lock.WRITE_LOCK); ! is = dbTokens.getAsStream(ref); ! //Does the token already has data in the index ? if (is != null) { ! //Add its data to the new list try { while (is.available() > 0) { *************** *** 1082,1087 **** size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != document.getDocId()) { ! // section belongs to another document: ! // copy data to new buffer os.writeInt(storedDocId); os.writeByte(storedSection); --- 1084,1089 ---- size = is.readFixedInt(); if (storedSection != currentSection || storedDocId != document.getDocId()) { ! // data are related to another section or document: ! // append them to any existing data os.writeInt(storedDocId); os.writeByte(storedSection); *************** *** 1090,1094 **** is.copyRaw(os, size); } else { ! // copy nodes to new list previousGID = 0; for (int j = 0; j < termCount; j++) { --- 1092,1097 ---- is.copyRaw(os, size); } else { ! // data are related to our section and document: ! // feed the new list with the GIDs previousGID = 0; for (int j = 0; j < termCount; j++) { *************** *** 1096,1116 **** storedGID = previousGID + delta; freq = is.readInt(); ! if (node == null ! && document.getTreeLevel(storedGID) < document ! .reindexRequired()) { ! for (int l = 0; l < freq; l++) { ! storedOccurencesList.add(storedGID, is.readInt()); ! } ! } else if (node != null ! && (!XMLUtil ! .isDescendantOrSelf( ! document, ! node.getGID(), ! storedGID))) { ! for (int l = 0; l < freq; l++) { ! 
storedOccurencesList.add(storedGID, is.readInt()); ! } ! } else ! is.skip(freq); previousGID = storedGID; } --- 1099,1121 ---- storedGID = previousGID + delta; freq = is.readInt(); ! if (node == null) { ! if (document.getTreeLevel(storedGID) < document.reindexRequired()) { ! for (int l = 0; l < freq; l++) { ! storedOccurencesList.add(storedGID, is.readInt()); ! } ! } else ! is.skip(freq); ! ! } else { ! if (!XMLUtil.isDescendantOrSelf( ! document, ! node.getGID(), ! storedGID)) { ! for (int l = 0; l < freq; l++) { ! storedOccurencesList.add(storedGID, is.readInt()); ! } ! } else ! is.skip(freq); ! } previousGID = storedGID; } Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.53 retrieving revision 1.54 diff -C2 -d -r1.53 -r1.54 *** NativeValueIndex.java 30 Dec 2005 19:29:41 -0000 1.53 --- NativeValueIndex.java 30 Dec 2005 19:54:02 -0000 1.54 *************** *** 448,454 **** LOG.warn(e.getMessage(), e); } catch (BTreeException e) { ! LOG.warn(e.getMessage(), e); } catch (IOException e) { ! LOG.warn(e.getMessage(), e); } finally { lock.release(); --- 448,454 ---- LOG.warn(e.getMessage(), e); } catch (BTreeException e) { ! LOG.error(e.getMessage(), e); } catch (IOException e) { ! LOG.error(e.getMessage(), e); } finally { lock.release(); *************** *** 480,488 **** final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { entry = (Map.Entry) i.next(); indexable = (Indexable) entry.getKey(); storedGIDList = (LongLinkedList) entry.getValue(); ! ref = new Value(indexable.serialize(collectionId, caseSensitive)); ! // Retrieve old index entry for the element try { lock.acquire(Lock.WRITE_LOCK); --- 480,488 ---- final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { + //Compute a key for the value entry = (Map.Entry) i.next(); indexable = (Indexable) entry.getKey(); storedGIDList = (LongLinkedList) entry.getValue(); ! ref = new Value(indexable.serialize(collectionId, caseSensitive)); try { lock.acquire(Lock.WRITE_LOCK); *************** *** 490,493 **** --- 490,494 ---- os.clear(); newGIDList = new LongLinkedList(); + //Does the value already has data in the index ? if (is != null) { try { *************** *** 521,529 **** } } catch (EOFException e) { ! //Is it expected ? -pb LOG.warn(e.getMessage(), e); - } catch (IOException e) { - LOG.error(e.getMessage(), e); - //TODO : data will be saved although os is probably corrupted ! -pb } } --- 522,527 ---- } } catch (EOFException e) { ! //Is it expected ? Remove this block if not -pb LOG.warn(e.getMessage(), e); } } |
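The flush() methods touched here share one structure: iterate the pending map, build a lookup key per value, merge the newly collected GIDs into whatever the index already holds under that key, and write the merged list back. Below is a much-simplified sketch of that structure, with an in-memory map standing in for the persistent BFile and a made-up key format; locking and the variable-byte encoding are deliberately left out.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.TreeSet;

    // Sketch of the pending-map flush: compute key, merge into stored list, write back.
    public class PendingFlushSketch {

        private final Map<String, List<Long>> store = new HashMap<>();   // stand-in for dbValues
        private final Map<String, List<Long>> pending = new HashMap<>();

        public void index(String value, long gid) {
            pending.computeIfAbsent(value, k -> new ArrayList<>()).add(gid);
        }

        public void flush(short collectionId) {
            for (Map.Entry<String, List<Long>> entry : pending.entrySet()) {
                String key = collectionId + ":" + entry.getKey();   // hypothetical key format
                TreeSet<Long> merged = new TreeSet<>(store.getOrDefault(key, new ArrayList<>()));
                merged.addAll(entry.getValue());                    // append the new GIDs
                store.put(key, new ArrayList<>(merged));            // write the merged, sorted list
            }
            pending.clear();
        }

        public static void main(String[] args) {
            PendingFlushSketch idx = new PendingFlushSketch();
            idx.index("foo", 42L);
            idx.index("foo", 7L);
            idx.flush((short) 1);
            idx.index("foo", 99L);
            idx.flush((short) 1);
            System.out.println(idx.store);   // {1:foo=[7, 42, 99]}
        }
    }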
|
From: Pierrick B. <br...@us...> - 2005-12-30 19:29:50
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv24606/src/org/exist/storage Modified Files: NativeTextEngine.java NativeElementIndex.java NativeValueIndex.java Log Message: Code cleaning, comments... Index: NativeElementIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeElementIndex.java,v retrieving revision 1.70 retrieving revision 1.71 diff -C2 -d -r1.70 -r1.71 *** NativeElementIndex.java 30 Dec 2005 18:44:45 -0000 1.70 --- NativeElementIndex.java 30 Dec 2005 19:29:41 -0000 1.71 *************** *** 218,222 **** long delta; Map.Entry entry; ! Value ref; Value value; VariableByteArrayInput is; --- 218,222 ---- long delta; Map.Entry entry; ! Value searchKey; Value value; VariableByteArrayInput is; *************** *** 228,247 **** final Lock lock = dbNodes.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { try { lock.acquire(Lock.WRITE_LOCK); ! newGIDList = new ArrayList(); ! entry = (Map.Entry) i.next(); ! storedGIDList = (ArrayList) entry.getValue(); ! qname = (QName) entry.getKey(); ! //Compute a key for the node ! if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { ! ref = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); ! } else { ! short sym = broker.getSymbols().getSymbol(qname.getLocalName()); ! short nsSym = broker.getSymbols().getNSSymbol(qname.getNamespaceURI()); ! ref = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); ! } ! value = dbNodes.get(ref); ! os.clear(); //Does the node already exist in the index ? if (value != null) { --- 228,247 ---- final Lock lock = dbNodes.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { + entry = (Map.Entry) i.next(); + storedGIDList = (ArrayList) entry.getValue(); + qname = (QName) entry.getKey(); + //Compute a key for the node + if (qname.getNameType() == ElementValue.ATTRIBUTE_ID) { + searchKey = new ElementValue(qname.getNameType(), collectionId, qname.getLocalName()); + } else { + short sym = broker.getSymbols().getSymbol(qname.getLocalName()); + short nsSym = broker.getSymbols().getNSSymbol(qname.getNamespaceURI()); + searchKey = new ElementValue(qname.getNameType(), collectionId, sym, nsSym); + } + newGIDList = new ArrayList(); + os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! value = dbNodes.get(searchKey); //Does the node already exist in the index ? if (value != null) { *************** *** 310,327 **** //Store the data if (value == null) { ! if (dbNodes.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not put index data for node '" + qname + "'"); } } else { ! if (dbNodes.update(value.getAddress(), ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not put index data for node '" + qname + "'"); } } } catch (LockException e) { ! LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); ! //TODO : return ? } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbNodes.getFile().getName() + "'", e); ! //TODO : return ? } finally { lock.release(); --- 310,325 ---- //Store the data if (value == null) { ! if (dbNodes.put(searchKey, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not put index data for node '" + qname + "'"); } } else { ! if (dbNodes.update(value.getAddress(), searchKey, os.data()) == BFile.UNKNOWN_ADDRESS) { ! 
LOG.error("Could not put index data for node '" + qname + "'"); } } } catch (LockException e) { ! LOG.warn("Failed to acquire lock for '" + dbNodes.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbNodes.getFile().getName() + "'", e); } finally { lock.release(); Index: NativeTextEngine.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeTextEngine.java,v retrieving revision 1.116 retrieving revision 1.117 diff -C2 -d -r1.116 -r1.117 *** NativeTextEngine.java 30 Dec 2005 18:44:45 -0000 1.116 --- NativeTextEngine.java 30 Dec 2005 19:29:41 -0000 1.117 *************** *** 824,829 **** String token; int count = 0; ! for (int section = 0; section <= ATTRIBUTE_SECTION; section++) { ! for (Iterator i = words[section].entrySet().iterator(); i.hasNext(); count++) { entry = (Map.Entry) i.next(); token = (String) entry.getKey(); --- 824,829 ---- String token; int count = 0; ! for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { ! for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext(); count++) { entry = (Map.Entry) i.next(); token = (String) entry.getKey(); *************** *** 834,838 **** os.clear(); os.writeInt(this.doc.getDocId()); ! switch (section) { case 0 : os.writeByte(TEXT_SECTION); --- 834,838 ---- os.clear(); os.writeInt(this.doc.getDocId()); ! switch (currentSection) { case 0 : os.writeByte(TEXT_SECTION); *************** *** 876,880 **** notifyObservers(progress); } ! words[section].clear(); } } --- 876,880 ---- notifyObservers(progress); } ! words[currentSection].clear(); } } *************** *** 927,942 **** final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbTokens.getLock(); ! for (byte currentSection = 0; currentSection < 2; currentSection++) { for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { try { ! lock.acquire(Lock.WRITE_LOCK); ! newOccurencesList = new OccurrenceList(); ! //Compute a key for the token ! entry = (Map.Entry) i.next(); ! storedOccurencesList = (OccurrenceList) entry.getValue(); ! token = (String) entry.getKey(); ! ref = new WordRef(collectionId, token); value = dbTokens.get(ref); - os.clear(); //Does the token already exist in the index ? if (value != null) { --- 927,942 ---- final short collectionId = this.doc.getCollection().getId(); final Lock lock = dbTokens.getLock(); ! for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { + //Compute a key for the token + entry = (Map.Entry) i.next(); + storedOccurencesList = (OccurrenceList) entry.getValue(); + token = (String) entry.getKey(); + ref = new WordRef(collectionId, token); + newOccurencesList = new OccurrenceList(); + os.clear(); try { ! lock.acquire(Lock.WRITE_LOCK); value = dbTokens.get(ref); //Does the token already exist in the index ? if (value != null) { *************** *** 1022,1033 **** if(os.data().size() == 0) { dbTokens.remove(ref); ! } else { if (value == null) if (dbTokens.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not put index data for token '" + ref + "'"); } else if (dbTokens.update(value.getAddress(), ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not update index data for token '" + ref + "'"); } } --- 1022,1034 ---- if(os.data().size() == 0) { dbTokens.remove(ref); ! 
} else { if (value == null) + //TOUNDERSTAND : is this ever called ? -pb if (dbTokens.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not put index data for token '" + token + "'"); } else if (dbTokens.update(value.getAddress(), ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not update index data for token '" + token + "'"); } } *************** *** 1035,1040 **** LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); ! return; } finally { lock.release(); --- 1036,1040 ---- LOG.warn("Failed to acquire lock for '" + dbTokens.getFile().getName() + "'", e); } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbTokens.getFile().getName() + "'", e); } finally { lock.release(); *************** *** 1045,1098 **** } ! public void reindex(DocumentImpl document, NodeImpl node) { ! final short collectionId = document.getCollection().getId(); ! int len, rawSize, docId; Map.Entry entry; ! String word; ! OccurrenceList idList; ! long last, gid, delta; ! int freq = 1; ! byte section; ! // NodeProxy p; ! WordRef ref; ! VariableByteInput is = null; ! Lock lock = dbTokens.getLock(); ! for (int k = 0; k < 2; k++) { ! for (Iterator i = words[k].entrySet().iterator(); i.hasNext();) { entry = (Map.Entry) i.next(); ! word = (String) entry.getKey(); ! idList = (OccurrenceList) entry.getValue(); ! ref = new WordRef(collectionId, word); try { lock.acquire(Lock.WRITE_LOCK); ! is = dbTokens.getAsStream(ref); ! os.clear(); if (is != null) { // add old entries to the new list try { while (is.available() > 0) { ! docId = is.readInt(); ! section = is.readByte(); ! len = is.readInt(); ! rawSize = is.readFixedInt(); ! if (docId != document.getDocId() || section != k) { // section belongs to another document: // copy data to new buffer ! os.writeInt(docId); ! os.writeByte(section); ! os.writeInt(len); ! os.writeFixedInt(rawSize); ! is.copyRaw(os, rawSize); } else { // copy nodes to new list ! gid = 0; ! for (int j = 0; j < len; j++) { ! gid += is.readLong(); freq = is.readInt(); if (node == null ! && document.getTreeLevel(gid) < document .reindexRequired()) { for (int l = 0; l < freq; l++) { ! idList.add(gid, is.readInt()); } } else if (node != null --- 1045,1104 ---- } ! public void reindex(DocumentImpl document, NodeImpl node) { ! OccurrenceList storedOccurencesList; ! int termCount; ! long storedGID; ! long previousGID; ! long delta; Map.Entry entry; ! String token; ! WordRef ref; ! VariableByteInput is; ! //TOUNDERSTAND -pb ! int size; ! int lenOffset; ! int storedDocId; ! byte storedSection; ! int freq; ! final short collectionId = document.getCollection().getId(); ! final Lock lock = dbTokens.getLock(); ! for (byte currentSection = 0; currentSection <= ATTRIBUTE_SECTION; currentSection++) { ! for (Iterator i = words[currentSection].entrySet().iterator(); i.hasNext();) { entry = (Map.Entry) i.next(); ! token = (String) entry.getKey(); ! storedOccurencesList = (OccurrenceList) entry.getValue(); ! ref = new WordRef(collectionId, token); ! os.clear(); try { lock.acquire(Lock.WRITE_LOCK); ! is = dbTokens.getAsStream(ref); if (is != null) { // add old entries to the new list try { while (is.available() > 0) { ! storedDocId = is.readInt(); ! storedSection = is.readByte(); ! termCount = is.readInt(); ! size = is.readFixedInt(); ! 
if (storedSection != currentSection || storedDocId != document.getDocId()) { // section belongs to another document: // copy data to new buffer ! os.writeInt(storedDocId); ! os.writeByte(storedSection); ! os.writeInt(termCount); ! os.writeFixedInt(size); ! is.copyRaw(os, size); } else { // copy nodes to new list ! previousGID = 0; ! for (int j = 0; j < termCount; j++) { ! delta = is.readLong(); ! storedGID = previousGID + delta; freq = is.readInt(); if (node == null ! && document.getTreeLevel(storedGID) < document .reindexRequired()) { for (int l = 0; l < freq; l++) { ! storedOccurencesList.add(storedGID, is.readInt()); } } else if (node != null *************** *** 1101,1110 **** document, node.getGID(), ! gid))) { for (int l = 0; l < freq; l++) { ! idList.add(gid, is.readInt()); } } else is.skip(freq); } } --- 1107,1117 ---- document, node.getGID(), ! storedGID))) { for (int l = 0; l < freq; l++) { ! storedOccurencesList.add(storedGID, is.readInt()); } } else is.skip(freq); + previousGID = storedGID; } } *************** *** 1115,1143 **** } catch (IOException e) { LOG.error("io-error while reading index entry for " ! + word, e); } } ! idList.sort(); ! len = idList.getTermCount(); os.writeInt(document.getDocId()); ! os.writeByte(k == 0 ? TEXT_SECTION : ATTRIBUTE_SECTION); ! os.writeInt(len); ! rawSize = os.position(); os.writeFixedInt(0); ! last = 0; ! for (int m = 0; m < idList.getSize(); ) { ! delta = idList.nodes[m] - last; ! os.writeLong(delta); ! last = idList.nodes[m]; ! freq = idList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { ! os.writeInt(idList.offsets[m + n]); } m += freq; ! } ! ! os.writeFixedInt(rawSize, os.position() - rawSize - 4); try { --- 1122,1158 ---- } catch (IOException e) { LOG.error("io-error while reading index entry for " ! + token, e); } } ! termCount = storedOccurencesList.getTermCount(); ! storedOccurencesList.sort(); os.writeInt(document.getDocId()); ! switch (currentSection) { ! case 0 : ! os.writeByte(TEXT_SECTION); ! break; ! case 1 : ! os.writeByte(ATTRIBUTE_SECTION); ! break; ! default : ! throw new IllegalArgumentException("Invalid inverted index"); ! } ! os.writeInt(termCount); ! lenOffset = os.position(); os.writeFixedInt(0); ! previousGID = 0; ! for (int m = 0; m < storedOccurencesList.getSize(); ) { ! delta = storedOccurencesList.nodes[m] - previousGID; ! os.writeLong(delta); ! freq = storedOccurencesList.getOccurrences(m); os.writeInt(freq); for (int n = 0; n < freq; n++) { ! os.writeInt(storedOccurencesList.offsets[m + n]); } + previousGID = storedOccurencesList.nodes[m]; m += freq; ! } ! os.writeFixedInt(lenOffset, os.position() - lenOffset - 4); try { *************** *** 1149,1152 **** --- 1164,1168 ---- } } catch (ReadOnlyException e) { + } } catch (LockException e) { *************** *** 1154,1159 **** is = null; } catch (IOException e) { ! LOG.error("io error while reindexing word '" + word ! + "'"); is = null; } finally { --- 1170,1174 ---- is = null; } catch (IOException e) { ! LOG.error("io error while reindexing word '" + token + "'"); is = null; } finally { *************** *** 1161,1165 **** } } ! words[k].clear(); } } --- 1176,1180 ---- } } ! 
words[currentSection].clear(); } } Index: NativeValueIndex.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeValueIndex.java,v retrieving revision 1.52 retrieving revision 1.53 diff -C2 -d -r1.52 -r1.53 *** NativeValueIndex.java 30 Dec 2005 18:44:45 -0000 1.52 --- NativeValueIndex.java 30 Dec 2005 19:29:41 -0000 1.53 *************** *** 273,277 **** long delta; Map.Entry entry; ! Value ref; Value value; VariableByteArrayInput is; --- 273,277 ---- long delta; Map.Entry entry; ! Value searchKey; Value value; VariableByteArrayInput is; *************** *** 280,293 **** final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { try { ! lock.acquire(Lock.WRITE_LOCK); ! newGIDList = new LongLinkedList(); ! entry = (Map.Entry) i.next(); ! indexable = (Indexable) entry.getKey(); ! storedGIDList = (LongLinkedList) entry.getValue(); ! //Compute a key for the value ! ref = new Value(indexable.serialize(collectionId, caseSensitive)); ! value = dbValues.get(ref); ! os.clear(); //Does the value already exist in the index ? if (value != null) { --- 280,293 ---- final Lock lock = dbValues.getLock(); for (Iterator i = pending.entrySet().iterator(); i.hasNext();) { + entry = (Map.Entry) i.next(); + indexable = (Indexable) entry.getKey(); + storedGIDList = (LongLinkedList) entry.getValue(); + //Compute a key for the value + searchKey = new Value(indexable.serialize(collectionId, caseSensitive)); + newGIDList = new LongLinkedList(); + os.clear(); try { ! lock.acquire(Lock.WRITE_LOCK); ! value = dbValues.get(searchKey); //Does the value already exist in the index ? if (value != null) { *************** *** 345,354 **** //Store the data if (value == null) { ! if (dbValues.put(ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not put index data for value '" + ref + "'"); } } else { ! if (dbValues.update(value.getAddress(), ref, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.warn("Could not update index data for value '" + ref + "'"); } } --- 345,354 ---- //Store the data if (value == null) { ! if (dbValues.put(searchKey, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not put index data for value '" + searchKey + "'"); } } else { ! if (dbValues.update(value.getAddress(), searchKey, os.data()) == BFile.UNKNOWN_ADDRESS) { ! LOG.error("Could not update index data for value '" + searchKey + "'"); } } *************** *** 357,362 **** //TODO : return ? } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbValues.getFile().getName() + "'", e); ! return; } finally { lock.release(); --- 357,361 ---- //TODO : return ? } catch (ReadOnlyException e) { ! LOG.warn("Read-only error on '" + dbValues.getFile().getName() + "'", e); } finally { lock.release(); |
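The reindex code above reads and writes the stored GID lists in delta form: the ids are kept sorted and each one is stored as the difference to its predecessor (storedGID = previousGID + delta on the read side, delta = nodes[m] - previousGID on the write side). The following small sketch covers just that encoding, with plain long lists standing in for the variable-byte streams; frequencies and offsets are omitted.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Sketch of delta-encoding a sorted GID list and decoding it back.
    public class GidDeltaSketch {

        static List<Long> encode(long[] sortedGids) {
            List<Long> deltas = new ArrayList<>();
            long previous = 0;
            for (long gid : sortedGids) {
                deltas.add(gid - previous);   // small numbers compress well in variable-byte form
                previous = gid;
            }
            return deltas;
        }

        static long[] decode(List<Long> deltas) {
            long[] gids = new long[deltas.size()];
            long previous = 0;
            for (int i = 0; i < deltas.size(); i++) {
                previous += deltas.get(i);    // storedGID = previousGID + delta
                gids[i] = previous;
            }
            return gids;
        }

        public static void main(String[] args) {
            long[] gids = {100, 105, 230, 231};
            List<Long> deltas = encode(gids);
            System.out.println(deltas);                           // [100, 5, 125, 1]
            System.out.println(Arrays.toString(decode(deltas)));  // [100, 105, 230, 231]
        }
    }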
|
From: Wolfgang M. M. <wol...@us...> - 2005-12-30 18:56:51
|
Update of /cvsroot/exist/eXist-1.0/webapp/irclog/scripts In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv16054/webapp/irclog/scripts Modified Files: irclog.js Log Message: Fixed previous/next link when browsing dates. Index: irclog.js =================================================================== RCS file: /cvsroot/exist/eXist-1.0/webapp/irclog/scripts/irclog.js,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** irclog.js 30 Dec 2005 16:42:30 -0000 1.2 --- irclog.js 30 Dec 2005 18:56:43 -0000 1.3 *************** *** 59,63 **** inputField : 'current-date', ifFormat : '%Y-%m-%d', ! button : 'set-date' } ); --- 59,66 ---- inputField : 'current-date', ifFormat : '%Y-%m-%d', ! button : 'set-date', ! onUpdate : function (calendar) { ! currentDate = calendar.date; ! } } ); |