Tuesday, February 28, 2006

du creation for Etc and Home completed

It's a small milestone, but AgileWiki3 is now generating the same descriptor units as AW2 does for Etc and Home--not that AW3 has any code that uses them! But it is nice to see the generated data. It exercises the display logic, if nothing else. :-)

I'm thinking that the next small step in breathing life into AW3 is account creation. And yes, you will find that I like working on aspects which cover many parts of the code. Development then remains test-centric. Though it is good occasionally to shift gears, review and complete a particular component or sub-system. That makes it easier for everyone to understand. Completeness is always helpful, and it allows you to stamp "stable" on parts of the project.

For example, TKS and all its components are pretty stable for now. I don't expect any changes until we start scaling this up for multi-processor support. (And I hope that by then "we" really means more than just me!)

Meanwhile, all the code is checked in under svn at SourceForge.

Bill

Monday, February 27, 2006

inverted tables

Had an easy day today and feeling much better.

Made some progress on the Ark API. Filled in a few gaps, like order. Added descriptor entries. Added a log of transactions for each cabinet. Stopped inverting name--it's now just a classifier entry like usage and restrict.

I also made a small change to TKS, using inverted properties instead of inverted tables. The problem was that the inversion of a table could get quite large, and that caused trouble. This should scale much better.

What are tables again? Think of them as table instances of table classes. Roles are built from multiple table instances, each table entry having a key and optional value. Because these tables are generally of reasonable size they cache nicely, so we should get great performance.

But when I had inverted tables [instances], each instance was tied to a key, and the entries were the role id and the optional value. And since many roles could potentially use the same key for the same type of value, the inverted table could be large. Now it's not a problem, because I'm no longer grouping the inverted items into tables.
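Roughly, the difference looks like this--a sketch using plain Java collections and made-up names, not the real TKS types:

    import java.util.TreeMap;

    // Stand-in types only; the real TKS classes differ.
    class InversionSketch {
        // Old layout: one inverted *table* per key, listing every role that
        // uses that key.  Popular keys made these tables huge, which defeated
        // the table cache.
        TreeMap<String, TreeMap<String, String>> invertedTables =
            new TreeMap<String, TreeMap<String, String>>();   // key -> (role id -> value)

        // New layout: one small inverted *property* per (key, role id) pair,
        // so no single cached item grows without bound.
        TreeMap<String, String> invertedProperties =
            new TreeMap<String, String>();                     // key + '\0' + role id -> value

        void invert(String key, String roleId, String value) {
            TreeMap<String, String> t = invertedTables.get(key);
            if (t == null) {
                t = new TreeMap<String, String>();
                invertedTables.put(key, t);
            }
            t.put(roleId, value);                               // old layout
            invertedProperties.put(key + "\0" + roleId, value); // new layout
        }
    }

The point is simply that each cached item now stays small, no matter how many roles share a key.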

Work remains on the Ark API. Need some methods to rearrange structure (add/remove/replace a parent). And I need to rework tags, add matches, imatches and itags. And then there's export (used to be called name) and includes. As well as wikiword recognition, references, citations and resolutions. Hmm. A week or two? That's not long for roughing out the API, but it's a long time to go without testing.

Maybe I need to take a different approach. This is not test-centric. Though it was good to take some time to review the API.

Bill

Sunday, February 26, 2006

very sick

Went to the dentist yesterday. Within 3 hours I had explosive stomach problems and a fever. Rough night! Slept all day today, with code running round and round in my head--still feverish.

Bill

Saturday, February 25, 2006

It's like watching grass grow

I've started working my way through the Ark API. So far, nothing very exciting--I'm just expanding on the javadocs.

It does give me a chance to write about a variety of things. My motivation is to make it attractive enough that some crazy programmers might actually start to use it. Hmm, it may be a while 'til that happens. Well then, at a minimum I figure it makes the code easier to understand.

Bill

NetBeans and TortoiseSVN

I'm using TortoiseSVN as the subversion client and I've been refactoring with NetBeans. Unfortunately, file name changes and file moves done with NetBeans now show up as file deletes and creates in subversion, which means the history gets lost and all the deleted files stay in subversion, like, forever.

But I also found that there is work on a subversion plugin for NetBeans. And THAT sounds like the right way to go. But what I've got is definitely good enough for now. Tortoise rocks!

Meanwhile, I've created an apps package under framework and a help package under that. Since an interactive application requires at least three files, I figured that would be a good way to group things.

Bill

svn change notifications for AgileWiki

For the truly dedicated, I've set up a mailing list for the svn change notifications.

See https://lists.sourceforge.net/lists/listinfo/compstrm-notifications

Bill

AgileWiki imported to SF subversion

I've completed the import of AgileWiki3 onto subversion at SourceForge.

To browse the subversion contents, please go to http://svn.sourceforge.net/viewcvs.cgi/compstrm/

Enjoy!

Bill

Subversion

I just finished reading the announcement from SourceForge that Subversion is now a production service. Cheers! Here are the links:

Complete service documentation is available at: http://sf.net/docs/E09/

Documentation is provided for supported clients at:
http://sf.net/docs/F06/ for the command-line SVN client
http://sf.net/docs/F07/ for TortoiseSVN

Friday, February 24, 2006

The help application!

Help is now interactive. This is done by creating an evaluator, HelpEvaluator, which evaluates user input and preserves session state. Really, it's pretty simple, and serves as a nice example of an interactive application.
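In spirit, an evaluator is just an object that gets fed each line of user input and hangs on to session state between lines. Something like this little sketch (illustrative names only, not the actual HelpEvaluator code):

    // A minimal sketch of an interactive evaluator.
    interface Evaluator {
        String evaluate(String input);   // returns the text to display
    }

    class HelpEvaluatorSketch implements Evaluator {
        private String currentTopic = "help";   // session state kept across calls

        public String evaluate(String input) {
            if (input == null || input.length() == 0) {
                return "Help topic: " + currentTopic + " (type a topic name, or 'quit')";
            }
            if (input.equals("quit")) {
                return "Leaving help.";
            }
            currentTopic = input;                // remember where the user navigated
            return "Describing '" + currentTopic + "'...";
        }
    }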

It occurs to me that the order command should also be interactive. But it will be a while until I get there.

It also occurred to me today that the Ark server can initially be single-threaded. I'll just need to use the NIO selector to handle multiple sessions. I've also got a fine collection of ideas on how to make the Ark scalable. Yes, I've been dreaming about Sun's T1 chip! But I think the Ark framework can safely remain single-threaded for quite some time.
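For the curious, a single-threaded selector loop is about this simple (a generic NIO sketch, not the actual Ark server):

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.nio.ByteBuffer;
    import java.nio.channels.SelectionKey;
    import java.nio.channels.Selector;
    import java.nio.channels.ServerSocketChannel;
    import java.nio.channels.SocketChannel;
    import java.util.Iterator;

    // A bare-bones single-threaded server loop; port and buffer size are arbitrary.
    public class SingleThreadedServerSketch {
        public static void main(String[] args) throws IOException {
            Selector selector = Selector.open();
            ServerSocketChannel server = ServerSocketChannel.open();
            server.socket().bind(new InetSocketAddress(8080));
            server.configureBlocking(false);
            server.register(selector, SelectionKey.OP_ACCEPT);

            ByteBuffer buffer = ByteBuffer.allocate(1024);
            while (true) {
                selector.select();                       // block until something is ready
                Iterator<SelectionKey> it = selector.selectedKeys().iterator();
                while (it.hasNext()) {
                    SelectionKey key = it.next();
                    it.remove();
                    if (key.isAcceptable()) {            // a new session
                        SocketChannel client = server.accept();
                        client.configureBlocking(false);
                        client.register(selector, SelectionKey.OP_READ);
                    } else if (key.isReadable()) {       // input from an existing session
                        SocketChannel client = (SocketChannel) key.channel();
                        buffer.clear();
                        if (client.read(buffer) < 0) {
                            client.close();              // session ended
                        }
                        // ...hand the buffer contents to the single-threaded Ark here...
                    }
                }
            }
        }
    }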

Now, having had my fun playing with interactive Help, it is time to work on fleshing out the Ark API, mm?

Meanwhile, note that the jar files were included in alpha7. So if you've got Java5 and just want to play with it... :-)

Bill

Thursday, February 23, 2006

interactive help

It finally occurred to me (ah, duh!) that help could/should be a bit interactive.

For example, you should be able to get the display of the document for a command without navigating there.

Definitely a step up. And I can see that a small change to ViewEnv would also be helpful for such interactivity. And it would make a great piece of sample code for doing things which don't conveniently decompose into simple commands and displays. (And something newbies would be likely to use, too.)

--The enhancement would be to give displays access to application-specific context (session data).

Bill

slow morning

Seems like every week I have a slow day or two, usually before the weekend. I got up at a nice time this morning (5AM), but motivational levels are low.

I did add another method to the Cmd interface, cmdGroup, which needs to return the group that the command belongs to (classifier, journal, games), and updated the existing commands to provide an appropriate return value. This will be used by the help command, and for organizing topics which describe commands (as is currently done at AgileWiki.org).
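In outline, the change amounts to this (simplified, and everything here other than cmdGroup and the group names is made up):

    // A hedged sketch of the Cmd change, not the exact interface.
    interface Cmd {
        String cmdName();
        String cmdGroup();   // e.g. "classifier", "journal", "games"
        // ...the evaluation and applicability methods are elided here...
    }

    // A made-up command, just to show the shape of the return value.
    class ExampleCmdSketch implements Cmd {
        public String cmdName()  { return "example"; }
        public String cmdGroup() { return "games"; }
    }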

I still want to document the various Ark interfaces, as well as flesh out the Ark API. Just don't expect too much today. :-)

Bill

Wednesday, February 22, 2006

TKS tables--the key to implementing a Rolonic DataBase

When I first envisioned TKS tables, which are just ordered key/value pairs in the value of a property on a node/record, I knew I could finally implement the Ark as a database with a Rolonic API. But I could not explain why this structure was so important. I was just talking with Norm, and to him the reason was obvious. And I agree with his answer.

The answer is that structure and stream (space and time) are equally important. Structures must then support ordering, or they will be constrained in their utility--as is all too typical of everything we do today when we program.

Conversely, my big conceptual breakthrough today--that it is so trivial to implement applications on top of the Ark API--comes down to this: typical implementations are excessively complicated because they do not handle time (changes over time / streaming) well.
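For anyone just tuning in, the shape of a TKS table is nothing exotic--roughly an ordered map stored as the value of a property on a node/record. Here's a sketch with stand-in types (not the real TKS classes):

    import java.util.TreeMap;

    // Stand-in types only; the real TKS node and property classes differ.
    class NodeSketch {
        // A node/record holds named properties...
        TreeMap<String, Object> properties = new TreeMap<String, Object>();

        // ...and a "table" is just an ordered set of key/value pairs stored as
        // one property's value.  The ordering is the whole point: it lets the
        // structure carry time (sequence) as well as space.
        void putTableEntry(String propertyName, String key, String value) {
            TreeMap<String, String> table =
                (TreeMap<String, String>) properties.get(propertyName);
            if (table == null) {
                table = new TreeMap<String, String>();
                properties.put(propertyName, table);
            }
            table.put(key, value);   // the value is optional in the real thing
        }
    }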

Bill

The Rolonic API

Today I was really struck by the thought that the API on the Ark really/definitively enables Rolonic programming. So I should give some effort to fleshing out the API.

For me, rolonic programming is all about being able to handle generically things that were always handled previously in application-specific ways. Knowing that data is ledger, classifier, descriptor or journal per se does not help. But having a generic API which handles state, inter-relationships, parameterization and time significantly reduces an application's implementation, often to the point where the implementation simply vanishes and a generic role can be used instead. Of course, that makes it really hard to demonstrate how to implement an application. (Which is why I'm delighted with the idea of doing some simple interactive games.)

Bill

Application-specific commands

I've always wanted to be able to implement application-specific commands and I realized today that I was trying too hard--it's really very easy, and especially now in AgileWiki3.

All you really need to do is to set a LedgerEntry key/value pair on a DescriptorUnit. (Previously this would have been a header.) Then in the applicability rule for the command (now part of the command interface), just check for an appropriate LedgerEntry value. Bingo! That's all that is needed.
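Here is the idea as a sketch, with stand-in classes (the real AW3 command and descriptor unit APIs differ):

    import java.util.HashMap;
    import java.util.Map;

    // Stand-in for a descriptor unit with ledger entry key/value pairs.
    class DescriptorUnitSketch {
        Map<String, String> ledger = new HashMap<String, String>();
        String ledgerEntry(String key) { return ledger.get(key); }
    }

    class GameCmdSketch {
        // The applicability rule: the command is only offered when the role's
        // descriptor unit carries the right ledger entry value.
        boolean isApplicable(DescriptorUnitSketch du) {
            return "higherLower".equals(du.ledgerEntry("game"));
        }
    }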

I've also been thinking about how to implement an application and have decided to implement the (very simple) higher/lower game. Pick a number that I'm thinking of (0-9) and I'll say higher or lower. You get 3 guesses. Easy to implement and a nice example of how to implement an application. (Albeit a stateless one in this case.)
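Just to show how little there is to it, here's the core logic as a sketch (how the target and the guess count get carried along without state is glossed over here):

    import java.util.Random;

    // A quick sketch of the higher/lower game logic.
    class HigherLowerSketch {
        static String guess(int target, int guess, int guessesUsed) {
            if (guess == target)  return "You got it!";
            if (guessesUsed >= 3) return "Out of guesses--it was " + target + ".";
            return guess < target ? "higher" : "lower";
        }

        public static void main(String[] args) {
            int target = new Random().nextInt(10);   // 0-9
            System.out.println(guess(target, 5, 1)); // e.g. "higher" or "lower"
        }
    }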

A second game could be tic-tac-toe. This game could be stateful and could also illustrate an application-specific command set.

I'm thinking it would be quite helpful to demonstrate how to program applications early in the development of AgileWiki3.

Bill

Tuesday, February 21, 2006

It's starting to get rather Arkish

Alpha6 is really getting there. We can list applicable topics, and that means we have at least a crude form of applicative context. Name resolution is not too far away.

One thing I did not do is include LSecs in the topics list. LSecs are part of a topic; they are not topics themselves. Let's have an ls command for listing applicable LSecs. Of course, LSecs should still be used for name resolution. (Exception: when the parent of an LSec is an LSec, it is (at the moment) still included in the topics list. That should probably be changed.)

--LedgerSections (LSecs), Remarks and Transactions are all first-class user objects, but they are not full-blown roles/topics.

Right now the applicative context is pretty primitive. It is only ancestors and their children. But give it time, and it will get quite complex. So it's good to start looking at this code now if you ever want to understand it. :-)

Bill

changing dependencies

I've enhanced Cachable to support changing dependencies. Then I fixed BTree to change dependencies when a node splits.

Meanwhile I've started working on the topics command. And what that really means is that I'm implementing ApplicativeContext and NameResolution. :-)

Bill

more fun with cache

Pathnames are now being cached. But unlike cabinetForRole, a role's pathname changes over time and the cached value can become invalidated for any number of reasons.

The advantage of caching pathnames comes when displaying lists of pathnames. And I suspect little advantage can be realized without a fairly large cache size. Fortunately pathnames are small, so we can have that large cache.

And yes, this is a prelude to caching topics.

Bill

Monday, February 20, 2006

a bit of a gem, a bit of fun--cache is now smarter

Usage and Restrict are now only keys in the cls- table, though they are well represented in the Ark's classifier interfaces.

I've been thinking about how to cache a role's topics, which requires dependency chaining. So I've enhanced the cache package to handle interdependencies--a cachable is no longer recycled so long as another cachable dependsOn it. OK well, that's as far as I got on topics.
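The dependsOn idea in miniature (a sketch, not the actual cache package):

    import java.util.HashSet;
    import java.util.Set;

    // A minimal sketch of dependency chaining between cachables.
    class CachableSketch {
        Set<CachableSketch> dependents = new HashSet<CachableSketch>();

        void dependsOn(CachableSketch other) { other.dependents.add(this); }

        // The cache only recycles an item once nothing still depends on it.
        boolean canRecycle() { return dependents.isEmpty(); }
    }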

But I have managed to enhance the btree package. The blocks for nodes and leaves are now marked as depending on the block holding their parent node. This means that preference will generally be given to keeping blocks holding higher-level nodes and recycling blocks holding lower-level nodes and leaves. Likely only a nominal speed increase, but increasingly important as the database grows in size.

Well, it was fun to do, anyway. And in the process I managed to uncover another TKS bug or two. (When testing TKS, I always drop the blockSize down to 256 bytes and the block and table cache down to 10--this really pushes TKS to its limits. In contrast, things are pretty sedate, though much faster, with a 32K block size.)

Bill

Sunday, February 19, 2006

When is a tag not a tag?

LSecs and Remarks are now part of a role's ledgers, rather than being treated as children of a role. But wholeness and partness are NOT symmetrical! LSecs and Remarks still have parents. Well, perhaps we could say that they are special children.

Now I'm thinking about tags, and I realize that so far I only have one tag, restrict. And perhaps it should not be a tag. Do we really want to find all roles which have a restrict tag? I think not. So restrict should be a classifier entry, like color or usage. And these are NOT inverted, unlike tags.

So let's have a cls- table for usage and restrict, and get rid of the usg- table. And let's then not think too much about tags until we have some. :-) Then maybe I can get back to finishing the descriptor units for Etc and Home. I think I've had enough rosebuds!

Bill

a quiet afternoon

Usage is no longer inverted. What was I thinking? That would have incurred a huge overhead for very little benefit. (The problem being that there would be lots of roles with the same usage.)

I've reorganized the class structure, again. This time I used NetBeans to do it. This is a very sweet tool.

Wrote javadocs for each package. Pretty brief except for the attribution to Norm at the top level. Still, it could be helpful to a new developer. (One of my goals here is to make the code as developer-friendly as possible to encourage community building.)

I've been thinking about restricting tags to the cabinet level. A bit of work, but it should give better coherency. This was Norm's suggestion a while back, but I resisted at the time because there was only one calendar then. Now I expect that every cabinet will have its own calendar. This also lines up with a need to have cabinets which are private to specific user groups.

And then there are LSecs and Remarks. I had glommed them all together with Cabinets, Drawers, Folders and Pages, but I am thinking now that they need to be managed separately. LSecs and Remarks are content (ledger), while the others are not.

I've also been thinking about how to accelerate access to topics. I learned that this was necessary in the previous release because of the repetitiveness of wiki name resolution. (A document can have a whole bunch of wiki names on it, and if this is slow, the display suffers.) My previous implementation was pretty shabby--caching was only valid for the duration of a single command. But if I implement inter-dependencies so I can trash the cache when it becomes invalid, I can build something much more effective.

Overall its been a quiet afternoon. Had a fine nap and feeling quite refreshed.

Bill

gathering rosebuds

I do not think that a headlong rush to implement all the necessary infrastructure is the best approach here. It's good to smell the flowers and pick the rosebuds along the way.
  • When a topic header is displayed, the descriptor unit, if any, is verified.
  • The results of determining the cabinet for a given topic are saved in a hash map--important info that costs something to calculate. Retaining this accelerates the location of a topic's descriptor unit. (See the sketch after this list.)
  • Descriptor units are now associated by their base name only. And they must be folders in the DescriptorUnits Drawer. By using the du name, descriptor units will not be so hard to handle when posting roles to another cabinet/ark.
  • Children are now listed by usage and name, rather than usage and pathname.
  • A topic's document is now displayed. Crudely, but it is displayed.
  • The standard descriptor units for cabinets have been reviewed and fixed up.
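The hash map mentioned above is just memoization--roughly this (stand-in names and types, not the actual AW3 code):

    import java.util.HashMap;
    import java.util.Map;

    // A sketch of the cabinet-for-role memoization.
    class CabinetCacheSketch {
        private Map<String, String> cabinetForRole = new HashMap<String, String>();

        String cabinetFor(String roleId) {
            String cabinet = cabinetForRole.get(roleId);
            if (cabinet == null) {
                cabinet = computeCabinet(roleId);    // the expensive walk up the tree
                cabinetForRole.put(roleId, cabinet);
            }
            return cabinet;
        }

        private String computeCabinet(String roleId) {
            return "...";  // elided: climb the ancestors until a Cabinet is reached
        }
    }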

I'm also thinking that there are a number of commands I could implement now, like ccd, ccdu and t. Really, forward progress is almost pointless unless I take the time to fill in at least some of the gaps as I go.

Bill

Saturday, February 18, 2006

A whole bunch of Ark Interfaces

I've now created a bunch of interfaces to the Ark class, as a means of giving some structure to the API--related methods are in the same interface. (Methods are grouped rolonically, mostly. So there are ledger interfaces, CmdLedgerArk and ViewLedgerArk, journal interfaces, etc.)

It's probably overkill at this point, but I expect the number of methods on Ark to keep growing. Only two of these interfaces are actually used, CmdArk and ViewArk, by the cmds and view packages. (CmdArk extends ViewArk.)

Tomorrow I hope to finish the code to generate the DescriptorUnits for the Home and Etc Cabinets. I also need to extend the default display to include a role's document--I'm currently generating LSec content which cannot be viewed. :-( After that, I want to work on user registration.

Bill

Happy days are here again!

There were several bugs in TKS, and one in Ark that was particularly difficult to track down. But everything now runs fine even with a table cache size of 10. (Ark initialization is currently creating about 750 tables.)

I'll note that the first time you ran the Ark, you always used to experience a long pause. Not with the new Ark! Initialization already creates the default descriptor units for 2 cabinets, and there is no noticeable pause on startup. :-)

I'm still figuring about a 100X increase in speed.

Bill

continued problems with TKS

I fixed a number of bugs in TKS and it seemed to be working. Then I increased the number of tables being created in the initialization transaction of the Ark and got the same behavior--it only runs correctly with a large table cache.

I now plan to rework the API of TKS, separating properties from tables like I did in the Ark. This will simplify the code. Perhaps I will find something in the process.

Bill

Friday, February 17, 2006

TKS is not fine

I found an error in Ark--setDuName needed to call removeProperty before assigning a new descriptor unit name. Then I found a bug in TKS.removeProperty.

Still, things were strange. Finally I realized that changing the cache size on tables fixed it--which likely means there is a reference count error in TKS. (The Ark is already using more than 128 tables for testing, and I had intentionally set the cache size small so I would find bugs like this early.) Hopefully it will not take too much longer to track this down.

On the bright side, I am now using the NetBeans debugger. It's quite nice.

Meanwhile, I've been thinking about the Ark's API. This is going to be big and messy unless I find some way to organize it. And it's an important API--it embodies a whole new way to program applications. My thought then is to use interfaces to organize it, just as I did with Env.

It would make a lot of sense to have separate APIs modeled after the AgileWiki command groups, but further broken down into queries and updates. There would be an interface for structural queries and another, extending it, for structural changes. Similarly for journal, ledger, classifier and descriptor. All the query APIs would then be rolled up into the ViewArk interface, which would be used by the view and view.text packages. And they would all be rolled up into a CmdArk interface for use by the cmds package. (The CmdArk interface would also extend a transactional interface.)
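As a sketch, the roll-up would look something like this (only ViewArk and CmdArk are real names; everything else here is a guess):

    // A hedged sketch of the proposed interface hierarchy.
    interface StructuralQueries { /* read-only structural methods */ }
    interface StructuralChanges extends StructuralQueries { /* structural updates */ }
    interface JournalQueries    { /* ... */ }
    interface JournalChanges    extends JournalQueries { /* ... */ }
    // ...likewise for ledger, classifier and descriptor...

    interface TransactionalArk  { /* begin/commit/abort */ }

    // All the query interfaces roll up into ViewArk (used by view and view.text)...
    interface ViewArk extends StructuralQueries, JournalQueries { }

    // ...and everything, plus transactions, rolls up into CmdArk (used by cmds).
    interface CmdArk extends ViewArk, StructuralChanges, JournalChanges, TransactionalArk { }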

Bill

TKS is fine

I went back to TKS and checked for multiple value assignments in the same transaction. It's working fine. So it's back to debugging the Ark--I've got some very strange behavior right now to track down.

NetBeans is definitely cool. But I still need to learn the debugger--old habits die hard.

Bill

Thursday, February 16, 2006

A possible bug in the TKS database

It's unconfirmed, but it looks like when I set a table value twice for the same key in the same transaction, it is the first value which is used.

Why did I try this? I'm building up layers of methods and setting a default value at the lower level. Not efficient, but an easy design.

If this is a bug, then either I fix TKS or it should raise an exception. The latter would force efficient code, but could sometimes be difficult to accommodate.
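Here's the check I have in mind, written against a stand-in interface (the method names are hypothetical, not the real TKS API):

    // Tks here is a stand-in, just to show the shape of the test.
    interface Tks {
        void beginTransaction();
        void setTableValue(String table, String key, String value);
        void commit();
        String getTableValue(String table, String key);
    }

    class SetTwiceCheck {
        static void check(Tks tks) {
            tks.beginTransaction();
            tks.setTableValue("t", "k", "default");   // lower layer sets a default
            tks.setTableValue("t", "k", "override");  // higher layer overrides it
            tks.commit();
            // Expected: "override".  Suspected bug: "default" is what comes back.
            System.out.println(tks.getTableValue("t", "k"));
        }
    }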

Meanwhile I'm working my way through the generation of the basic descriptor units when a cabinet is created. I've just added a way to set the descriptor on a role and implemented ledger entry key/value pairs (previously headers) as well.

Bill

limited progress on aw3

Learning an IDE can be distracting. But I'm really enjoying editing with NetBeans--it's nice, and gives plenty of feedback on syntax.

I have changed the interfaces a tad in aw3. Print really did not belong on a generic portal interface, so I've created a TextPortal interface and moved it there. This interface is then used by the AgileWiki.view.text package, where Portal is cast to TextPortal. To make this work, I've also added a getPortal method to Env and the ViewEnv interface.
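In outline, the refactoring looks like this (simplified; the print signature is a guess, and Env gets the same accessor):

    // A hedged sketch of the split, not the exact AW3 interfaces.
    interface Portal {
        // generic, view-independent methods only
    }

    interface TextPortal extends Portal {
        void print(String text);   // text-specific, so it lives here, not on Portal
    }

    interface ViewEnv {
        Portal getPortal();        // the new accessor; view.text casts this to TextPortal
    }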

A tags API is now implemented in Ark, and when present, tags are listed in the default display of a role. (Given the TKS API, this was trivial.)

I've also started creating the default descriptor units, but progress there is a bit tedious.

Still no commands which actually do an update. I want to work on users first.

Bill

Wednesday, February 15, 2006

going sideways

I've updated the download page now to reference only 2.2.0.1 and AgileWiki3.

I also tried to use Creator but was unable to get it to accept generics. So I switched over to NetBeans. (I really prefer just an editor/javac/java, but gosh, IDEs should be mature by now. Time to learn a few new tricks.)

And with everything else going on, of course, I've made no progress on the code since returning from Raipur. :-(

Bill

Tuesday, February 14, 2006

bit of a crazy evening

Most Sun software products are free these days--tools, web servers and all--sans support, of course. So I figured I'd go with Sun's Java Studio Creator as an IDE. It requires 1GB of RAM, but it's a pretty nice tool. I had 256MB, so I went out and bought a GB. Then shopping (always necessary after a long trip). Took a long time to find an auto rickshaw willing to go our way--and then only for Rs30!

Got home. Turned on the computer just now and realized I now have a 500MHz system instead of 1.5GHz--slow RAM. Well, but it boots faster. :-/

Perhaps tomorrow I'll finally get to work on AgileWiki.org? Meanwhile, I'm downloading Creator--all 254MB of it.

Bill

AgileWiki falling back

It looks like 2.2.0.1 on Jan 8th was the last good release. I'll be falling back to that. Unfortunately, as content is not backward compatible, I'll also be falling back to content from that time as well. :-(

And speaking of incompatibilities, AgileWiki3 will be wholly incompatible with the current AgileWiki. But it will be some time until aw3 is ready for use, so perhaps some migration strategies can be worked out. But the dump files are based on internal structures, which are quite different in aw3.

Bill

Monday, February 13, 2006

subversion delay

I've released AgileWiki3, alpha 1. But I'm going to wait until SourceForge supports subversion before putting it under code control.

SourceForge had a beta going on subversion, but the beta is now closed. They now plan production support for it later this month.

Bill

back with the goods

Just got in... 2 nights on the train!

I've been quite busy and unusually productive. Go to http://agilewiki.org/AgileWiki3/ and from there you can access the src or aw3api.

Not much is working, but more than half of the architecture is in place. There are also some nice changes at the application level. In particular, transactions are now first-class user objects, with a usage of Tran.

I've got the beginnings of a command-line interface, DirectPortal. This is strictly a data/view/control architecture, but with different portals (direct, remote, swing, html, etc.). The portal then selects a compatible view module. The Ark class serves as the data repository. Both displays (under view) and commands (under cmds) are extensible--I just use a .txt file to list the command names and classes. Oh, and a command is not applicable if the display it uses has not been implemented for the view chosen by the portal. And yes, this is a pretty open architecture, but I haven't yet proven it as I've only got a command-line interface right now.
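The extensibility mechanism really is that thin--something like this sketch (the file name, format and registration details shown here are just for illustration):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // A sketch of .txt-driven command loading via reflection.
    class CmdLoaderSketch {
        static void loadCmds() throws Exception {
            BufferedReader in = new BufferedReader(new InputStreamReader(
                CmdLoaderSketch.class.getResourceAsStream("cmds.txt")));
            String line;
            while ((line = in.readLine()) != null) {
                String[] parts = line.trim().split("\\s+");   // e.g. "help  cmds.HelpCmd"
                if (parts.length < 2) continue;
                Object cmd = Class.forName(parts[1]).newInstance();
                // ...register parts[0] -> cmd, skipping it if the command's display
                // is not implemented for the view chosen by the portal...
            }
            in.close();
        }
    }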

But note that in this implementation, there will be a scripting language. And applicative context order will be completely under user control. Further, drawers/folders/pages and lsecs are no longer partitioned; you will be able to intermix them any way you want.

My hope now is to get this all under subversion this weekend.

Bill